Libraries & learning analytics: A brief history.

March 5, 2018 — Revised and updated from the original post on November 10, 2017.

 

A slide deck from EDUCAUSE made the rounds on Twitter last week, with many folks expressing shock about libraries & their involvement (complicity) in learning analytics efforts on higher education campuses. But this isn’t new. Academic librarians have been talking about using library data to prove library value for quite a while. Over the past decade, the conversation has been held hostage by one particular professor who has made proving library value the exclusive focus of her scholarly research agenda.

As the old saying goes, if you’re not pissed off, you haven’t been paying attention.

To me, these are some of the significant milestones in the conversation about libraries and their involvement in learning analytics. (Emphasis on “to me” — your timeline might look a bit different!)

2010
Megan Oakleaf, LIS professor at Syracuse University, publishes the Value of Academic Libraries Report, which was commissioned by ACRL. The report suggests that libraries should track individual student behavior to demonstrate correlations between library use and institutional outcomes, such as retention.

2011
The Value of Academic Libraries committee is formed by the ACRL Executive Committee.

2012
ACRL is awarded a $249,330 grant from IMLS to fund Assessment in Action: Academic Libraries and Student Success.

2013 – 2016
ACRL runs three 1-year cohorts of AiA projects. Assessment in Action aims to teach academic librarians how to collaborate with other stakeholders on their campuses to measure the library’s impact on student success. According to the AiA website: “The projects will result in a variety of approaches to assessing library impact on student learning which will be documented and disseminated for use by the wider academic library and higher education communities.”

Spring 2014
Oakleaf teaches IST 600 “Academic Libraries: Value, Impact & ROI” at Syracuse University for the first time.

August 2014
Margie Jantti presents "Unlocking Value from Your Library's Data" at the Library Assessment Conference. The presentation highlights how, among other metrics, the University of Wollongong correlated student performance with the number of hours students spent using the library's electronic resources.

October 2014
Bell publishes "Keeping Up With… Learning Analytics" on the ALA website.

December 2014
Lisa Hinchliffe and Andrew Asher present “Analytics and Privacy: A Proposed Framework for Negotiating Service and Value Boundaries” at the Coalition for Networked Information Fall Membership Meeting.

March 2015
Oakleaf publishes “The Library’s Contribution to Student Learning: Inspirations and Aspirations” in College & Research Libraries.

2016
Jantti and Heath publish "What Role for Libraries in Learning Analytics?" in Performance Measurement and Metrics. The article describes how they integrated existing library analytics and student data (from the "Library Cube") with institutional learning analytics efforts at the University of Wollongong.
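(As an aside, for readers who want a concrete picture of what this kind of analysis involves, here is a minimal, purely hypothetical sketch of the general technique: joining per-student library usage logs with institutional records and computing a correlation between usage and an outcome like GPA. The column names and numbers are invented for illustration only; this is not the University of Wollongong's actual Library Cube implementation.)

    # Hypothetical sketch only: invented data and column names, not any institution's real pipeline.
    import pandas as pd

    # Invented library usage log: one row per student per e-resource session
    usage = pd.DataFrame({
        "student_id": [1, 1, 2, 3, 3, 3],
        "hours": [1.5, 2.0, 0.5, 3.0, 1.0, 2.5],
    })

    # Invented institutional records: one row per student
    records = pd.DataFrame({
        "student_id": [1, 2, 3],
        "gpa": [3.2, 2.8, 3.7],
    })

    # Sum usage per student, join with records, and compute a Pearson correlation
    per_student = usage.groupby("student_id", as_index=False)["hours"].sum()
    merged = per_student.merge(records, on="student_id")
    print(merged["hours"].corr(merged["gpa"]))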

June 2016
College & Research Libraries News declares learning analytics one of the top trends in academic libraries.

July 2016
Oakleaf publishes “Getting Ready & Getting Started: Academic Librarian Involvement in Institutional Learning Analytics Initiatives” in The Journal of Academic Librarianship.

I present "Can we demonstrate library value without violating user privacy?" at the Colorado Academic Library Association Workshop in Denver.

2017
Oakleaf secures nearly $100,000 in grant funding from IMLS for "Library Integration in Institutional Learning Analytics (LIILA)". The full proposal can be read here.

January 2017
The ACRL Board discusses "patron privacy" and whether, as a core value, it conflicts with support of learning analytics. The minutes record: "Confidentiality/Privacy is in ALA's core values, and the Board agreed that patron privacy does not need to conflict with learning analytics, as student research can still be confidential."

Also at Midwinter 2017, the ACRL Board approves Institutional Research as an interest group to incorporate interest in learning analytics (notably, the Board did not want to name it the "Learning Analytics" interest group). The Board also formally adopts the Proficiencies for Assessment Librarians and Coordinators, which make frequent reference to using learning analytics.

March 2017
Oakleaf et al. present "Data in the Library is Safe, But That's Not What Data is Meant For" at ACRL 2017 in Baltimore, Maryland.

April 2017
Kyle M.L. Jones and Dorothea Salo's article, "Learning Analytics and the Academic Library: Professional Ethics Commitments at a Crossroads", is available as a preprint from College & Research Libraries.

June 2017
The Value of Academic Libraries committee meets at ALA Annual. The minutes reflect that VAL wants to distance itself from learning analytics, now that learning analytics has its own interest group.

September 2017
ACRL publishes Academic Library Impact, which explicitly advocates for working with stakeholders to “statistically analyze and predict student learning and success based on shared analytics”.

October 2017
Karen Nicholson presents her paper, "The 'Value Agenda': Negotiating a Path Between Compliance and Critical Practice", at the Canadian Library Assessment Workshop in Victoria, British Columbia.

November 2017
Oakleaf et al. present "Closing the Data Gap: Integrating Library Data into Institutional Learning Analytics" at EDUCAUSE 2017 in Philadelphia. The presentation seems to advocate feeding individual patron data into campus-wide learning analytics dashboards so that other campus administrators, faculty, and advisors can see student interactions with the library.

Emily Drabinski asks, “How do we change the table?” In her blog post, she wonders how organizing can help librarians build power to make change. “We need to reject learning analytics,” she declares.

Penny Beile, Associate Director of Research, Education, and Engagement at the University of Central Florida Libraries, publishes “The Academic Library’s (Potential) Contribution to the Learning Analytics Landscape” on the EDUCAUSE blog.

January 2018
April Hathcock responds to the ongoing learning analytics conversation with her own blog post about learning agency. Regarding the need to collaborate with students rather than simply surveil them, she writes, “Essentially, it’s the difference between exploiting a community to study and report on them versus collaborating with that community in studying their needs. It is the very essence of feminist research methods, rooted in an ethic of care, trust, and collaborative empowerment.”

March 2018
Community college librarian Meredith Farkas questions the value of learning analytics in her column in American Libraries.

Kyle M.L. Jones and Ellen LeClere publish “Contextual Expectations and Emerging Informational Harms: A Primer on Academic Library Participation in Learning Analytics Initiatives” in Applying Library Values to Emerging Technology: Decision-Making in the Age of Open Access, Maker Spaces, and the Ever-Changing Library.

April 2018
The Call for Proposals for the special issue of Library Trends about learning analytics and the academic library closes April 1. The issue will be published in March 2019.

Featured image by Lukas Blazek on Unsplash

Have we confused surveillance with assessment of student learning?

Somehow I had been blissfully unaware of Respondus LockDown Browser until last week, when several students came to the library asking if we had this software available on our computers. If you're not familiar with this product, LockDown Browser is one of several LMS-integrated cheating-prevention tools. In simple terms, it locks down a student's Internet browser while they are taking a test in a learning management system such as Canvas or Blackboard. One of the students who asked about it said something that made the hair on the back of my neck stand up.

“I need a webcam,” they said. “I have to take the quiz with my webcam on, and there can’t be any movement in the background.”

What the hell? I thought. What are they talking about?

Recording Students During Online Tests

After doing some digging through an e-mail chain, I found a message from the campus eLearning Administrator with instructions for students taking tests with Respondus.

You will be required to use LockDown Browser with a webcam which will record you while you are taking the three module tests. Your computer must have a functioning webcam and microphone. A broadband connection is also required.

  • You will first need to review and agree to the Terms of Use.
  • The Webcam Check will confirm that your webcam and microphone are working properly. The first time the Webcam Check is performed on a computer, Adobe Flash Player will require you to select Allow and Remember.
  • Next you will be asked to take a picture of yourself.
  • After that, you will be required to show and take a picture of a government issued ID such as a driver’s license with your picture clearly displayed. If you don’t have a driver’s license, you can use your student ID card with your picture clearly displayed.
  • Click “Start Recording” and slowly tilt/pan your webcam so a brief video can be made of the area around your computer. Make sure you show your desk, what’s underneath your desk, and a panorama of the room you are in.  (If the webcam is built into the monitor, move it around as best you can to show the areas described.)

As a librarian who cares deeply about student privacy, all of this makes me want to throw up. If I understand this correctly, students must:

  • Accept Terms of Use (which I couldn’t find on the Respondus website, so I’m not sure what, exactly, students are agreeing to)
  • Take a picture of themselves
  • Share their government-issued ID (which would include their date of birth, address, height, weight, and other personal details)
  • Share whatever is visible around their desk and workspace, which, if they're at home, could include any number of extremely personal items.

Can we agree that asking a student to show “what’s underneath your desk” is particularly perverse?

But the benefits of this invasive procedure, according to Respondus, are numerous—easy to integrate with existing learning platforms, money saved on printing costs, increased efficiency, superior confidence in the accuracy of test results, and so on.

Beyond privacy, what are some other concerns? After some brief searching, I found a presentation from 2012 in which two researchers at Central Washington University found that Respondus was incredibly easy to manipulate to steal student data; hopefully this has changed. The following year, the same presenter, Donald Moncrief, gave a follow-up presentation about the exact methodology they used (which they had withheld the previous year, probably to prevent folks from following their steps).

My outrage is a little delayed. Respondus has been in business for ten years. Their website boasts that their software is used to proctor 50 million exams annually and they work with 2,000 institutions in 50 different countries. But here I am, angry as ever, concerned that educators have gotten carried away with a technology without considering its implications. And, as usual, my gripe is about assessment.

What are we really measuring?

Respondus offers regular training webinars for instructors. Here are the outcomes for an upcoming webinar:

Each training will cover, from the instructor perspective:

  • How to use LockDown Browser to prevent digital cheating in proctored testing environments
  • How to use Respondus Monitor in non-proctored environments, to protect exam integrity and confirm student identity
  • How Respondus Monitor provides greater flexibility for when and where tests are taken
  • Efficient review of the assessment data collected, including student videos
  • Best practices and tips for success with both applications
  • A chance to ask questions

I am particularly confused by the fourth item, about reviewing "the assessment data collected, including student videos." How is the surveillance data collected considered assessment data? Isn't the assessment data the actual test results (e.g., whether or not students could meet the learning outcomes of the quiz or test)? I suppose if you saw clear evidence of academic dishonesty in the surveillance data (for example, the student had the textbook open on their desk during a "no book" test), then it would invalidate the assessment results, but it would not be the assessment data itself.

Maybe they’re just using “assessment” in an inaccurate way. Maybe it’s not a big deal. But I’m inclined to believe the word “assessment” has a particular meaning about student learning, and most accrediting bodies would agree.

Accreditation and surveillance

Colleges and universities almost never lose accreditation over facilities. You can educate students in a cornfield, in a portable building, in a yurt without running water or electricity—provided you have assessment data that shows that student learning outcomes were met for the program. You can’t award degrees without assessment data. You have to show that your students learned something. Seems reasonable, no?

So here’s my worry. Are we confusing surveillance with assessment data? Do we think that recording students during exams will appease accreditors? “Look, see! They didn’t cheat. They answered all of these test questions, and they got good scores.”

I understand the occasional need for a controlled testing environment, especially in high-stakes exam situations for professional certification (I'm thinking of the NCLEX for nurses, for example). I don't understand controlled testing for formative assessment, especially for short quizzes in a first-year general education course. Even in a completely online course, I'm not sure I see the value in putting students through surveillance measures for quick knowledge checks of essential facts. When it comes to summative assessment of your course's essential learning outcomes, couldn't you assess the learning outcomes some other way, one that prevents simple cheating? What possibilities might open up if you invited your students to deeply process the material, connect to it in their own way, and show you the meaning they've made from it?

I think that there is no greater indication of an instructor’s values than how they spend time in a classroom. If what you truly value is assessing student learning in a tightly-controlled, surveilled environment—why not just take the quiz in a computer lab classroom where you can watch all students at once?

Is surveillance necessary for accreditation of online degrees?

My first answer to this question is, I'm not sure, and I'd like to learn more about this. I know that some fully online programs require students to take exams at proctored testing sites (e.g., at a campus testing center at a nearby college or university). This practice is held up to accrediting agencies as proof of the program's commitment to academic honesty. Of course, there is some healthy skepticism about this. In a 2011 article about online exam procedures, researchers suggested that requiring a once-per-semester proctored exam was "a token effort to ensure academic honesty."

I took a quick glance through the Western Association of Schools and Colleges (WASC) Postsecondary Accreditation Manual, and I couldn't find the word proctor anywhere in the document, nor the word cheat or the phrase academic honesty (the word honesty does appear, but only in describing the institution's governance procedures). While it is important to demonstrate that student learning outcomes are being met through valid means (e.g., institutions need some reasonable assurance that students are doing their own work), I could not find evidence that this accrediting body specifically requires proof of proctoring or cheating prevention. Does anyone know if other accrediting standards indicate otherwise?

Sources

Cluskey Jr., G. R., Ehlen, C. R., & Raiborn, M. H. (2011). Thwarting online exam cheating without proctor supervision. Journal of Academic and Business Ethics, 4, 1-7.

Moncrief, D., & Foster, R. (2012). Well that was easy: Misdirecting Respondus Lockdown Browser for fun and profit. Retrieved from http://digitalcommons.cwu.edu/source/2012/oralpresentations/18/

Moncrief, D. (2013). Respondus LockDown Browser revisited: Disclosure. Retrieved from http://digitalcommons.cwu.edu/source/2013/oralpresentations/73/

Postsecondary Accreditation Manual. (2013). Western Association of Schools and Colleges. Retrieved from http://www.acswasc.org

Respondus Lockdown Browser. (2017). Retrieved from https://www.respondus.com

Featured image courtesy of Pixabay.com.