To Lock or Not to Lock?

Slides from IGNIS Webinar (4/26/18)

Last fall, I was working as a part-time librarian at a community college when a student approached me at the reference desk and asked where she could take her test. She explained that she needed to download software called Respondus that would lock down her browser, and she needed a webcam because she would be recorded during the test. It was through that interaction that I first learned about the complicated world of online exam proctoring.

Now that I’ve had some time to think about this topic a little more (and get over my initial shock), I’ve come to the conclusion that the decision to secure online assessments is part of the instructional design process. I’ve read some interesting pieces, suggested below, which made me think about the various factors that instructors need to weigh before deciding to use online exam security.

Faculty who teach online can’t ignore the imperative to ensure that the student who is taking the online course is the one completing the work. This requirement comes from the 2008 Higher Education Opportunity Act (HEOA). However, HEOA does not dictate exactly how programs must verify identity; it suggests a range of options, including secure logins, proctoring, and other technologies.

Some courses don’t use exams at all. However, in courses that do rely heavily on exams, especially for large portions of the student’s overall grade, the instructor may feel strongly that they need to verify the student’s identity and ensure that the student is not receiving any outside help during the exam. Online proctoring software may include features that require students to show photo ID and limit the student’s computer capabilities (by locking down the browser). Some companies offer live remote proctoring where the student is monitored via webcam to ensure the student is following the rules of the exam.

In institutions where faculty do not have access to proctoring software (these commercial services are costly), instructors may add features to their exams to make cheating more difficult. These can include randomizing answer order, timing questions, showing only one question at a time, preventing students from reviewing correct answers, and limiting the time window during which the exam can be taken.
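In Canvas, for example, most of these hardening options correspond to ordinary quiz settings that can be applied through the LMS or its API. A minimal sketch, assuming the Canvas Quizzes API’s documented quiz[...] form parameters (the course, quiz title, dates, and time limit here are hypothetical placeholders):

```python
# Sketch: building the settings for a "hardened" Canvas quiz, as sent to
# POST /api/v1/courses/:course_id/quizzes. Field names follow the Canvas
# Quiz object; the title, dates, and time limit are hypothetical.

def hardened_quiz_settings(title, time_limit_minutes, unlock_at, lock_at):
    """Return form parameters applying the hardening options described above."""
    return {
        "quiz[title]": title,
        "quiz[shuffle_answers]": True,           # randomize answer order
        "quiz[time_limit]": time_limit_minutes,  # timed attempt
        "quiz[one_question_at_a_time]": True,    # show one question at a time
        "quiz[hide_results]": "always",          # block review of correct answers
        "quiz[unlock_at]": unlock_at,            # limit the availability window
        "quiz[lock_at]": lock_at,
    }

settings = hardened_quiz_settings(
    "Module 3 Quiz", 20, "2018-05-01T09:00:00Z", "2018-05-01T17:00:00Z")
for key, value in settings.items():
    print(f"{key} = {value}")
```

None of these settings requires a proctoring vendor; they are built into the quiz tool itself.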

At the same time, making online assessments more difficult to take can hinder accessibility and equity. Randomizing answer order can slow down learners who rely on a consistent presentation of answers in order to respond successfully. Faculty are also advised to shorten the window during which tests are available, to prevent collaboration and answer sharing. This sounds good in theory, but it can burden students who may need to shift work or childcare schedules in order to take a test during a limited time frame.

There are many ways to discourage academic dishonesty in online courses without using proctoring (which is costly to the institution and still does not completely prevent cheating). I think the first step is to look at the course’s learning outcomes and determine if any of the outcomes can be assessed without using simple multiple choice exams. Moving away from exams to other kinds of assessments, including group work, portfolios, problem-based assignments, essays, and reflective writing, can give a clearer picture of students’ learning.

If exams must be used, it is best to limit their weight (e.g., make them worth less of a student’s overall grade). Another option is to use automatically graded multiple-choice tests as simple knowledge checks that unlock the next module but do not affect the learner’s grade.

When I think back to the scenario of the student who needed somewhere to take her exam, the real head-scratcher for me is that she was not taking a fully-online course. She was taking a face-to-face course and, obviously, since she was standing in front of me, she was a student who came to campus regularly. She asked me this question very early in the quarter, far too early to be taking any kind of summative assessment like a midterm or final exam. So why did her instructor feel it was necessary to use online proctoring software? If it was a low-value quiz, couldn’t it have been taken in person, during class? I don’t know the answers to these questions, but these are the kinds of conversations I’d like to have with faculty to better understand their thoughts about securing assessments.

Suggested Reading

Alessio, H. M., Malay, N., Maurer, K., Bailer, A. J., & Rubin, B. (2017). Examining the effect of proctoring on online test scores. Online Learning, 21(1), 146-161.

Higher Learning Commission. (2018, January). Federal compliance overview. Retrieved from http://download.hlcommission.org/FedCompOverview_PRC.pdf

Michael, T. B., & Williams, M. A. (2013). Student equity: Discouraging cheating in online courses. Administrative Issues Journal, 3(2). Retrieved from https://dc.swosu.edu/aij/vol3/iss2/6

Schaffhauser, D. (2016, April 6). How students try to bamboozle online proctors. Campus Technology. Retrieved from https://campustechnology.com/articles/2016/04/06/how-students-try-to-bamboozle-online-proctors.aspx

Smith, C., & Noviello, S. (2012, September). Best practices in authentication and verification of students in online education [Presentation]. Retrieved from http://hdl.handle.net/10755/243374

Stuber-McEwen, D., Wiseley, P., & Hoggatt, S. (2009). Point, click, and cheat: Frequency and type of academic dishonesty in the virtual classroom. Online Journal of Distance Learning Administration, 12(3), 1-10.

Watson, G., & Sottile, J. (2010). Cheating in the digital age: Do students cheat more in online courses? Online Journal of Distance Learning Administration, 13(1). Retrieved from https://www.westga.edu/~distance/ojdla/spring131/watson131.html

Featured image by Mike Szczepanski on Unsplash

Have we confused surveillance with assessment of student learning?

Somehow I had been blissfully unaware of Respondus LockDown Browser until last week, when several students came to the library asking if we had this software available on our computers. If you’re not familiar with this product, Respondus is one of several LMS-integrated cheating-prevention tools. In simple terms, it locks down a student’s web browser while they are taking a test in an online class environment, such as Canvas or Blackboard. One of the students who asked about Respondus said something that raised the hair on the back of my neck.

“I need a webcam,” they said. “I have to take the quiz with my webcam on, and there can’t be any movement in the background.”

What the hell? I thought. What are they talking about?

Recording Students During Online Tests

After doing some digging through an e-mail chain, I found a message from the campus eLearning Administrator with instructions for students taking tests with Respondus.

You will be required to use LockDown Browser with a webcam which will record you while you are taking the three module tests. Your computer must have a functioning webcam and microphone. A broadband connection is also required.

  • You will first need to review and agree to the Terms of Use.
  • The Webcam Check will confirm that your webcam and microphone are working properly. The first time the Webcam Check is performed on a computer, Adobe Flash Player will require you to select Allow and Remember.
  • Next you will be asked to take a picture of yourself.
  • After that, you will be required to show and take a picture of a government issued ID such as a driver’s license with your picture clearly displayed. If you don’t have a driver’s license, you can use your student ID card with your picture clearly displayed.
  • Click “Start Recording” and slowly tilt/pan your webcam so a brief video can be made of the area around your computer. Make sure you show your desk, what’s underneath your desk, and a panorama of the room you are in.  (If the webcam is built into the monitor, move it around as best you can to show the areas described.)

As a librarian who cares deeply about student privacy, all of this makes me want to throw up. If I understand this correctly, students must:

  • Accept Terms of Use (which I couldn’t find on the Respondus website, so I’m not sure what, exactly, students are agreeing to)
  • Take a picture of themselves
  • Share their government-issued ID (which would include their date of birth, address, height, weight, and other personal details)
  • Share whatever is visible around their desk and workspace, which, if they’re at home, could include any number of extremely personal items.

Can we agree that asking a student to show “what’s underneath your desk” is particularly perverse?

But the benefits of this invasive procedure, according to Respondus, are numerous—easy to integrate with existing learning platforms, money saved on printing costs, increased efficiency, superior confidence in the accuracy of test results, and so on.

Beyond privacy, what are some other concerns? After some brief searching, I found a 2012 presentation in which two researchers at Central Washington University showed that Respondus was incredibly easy to manipulate to steal student data (hopefully this has changed). The following year, one of the presenters, Donald Moncrief, gave a follow-up presentation about the exact methodology they used (which they had withheld the previous year, probably to prevent folks from following their steps).

My outrage is a little delayed. Respondus has been in business for ten years. Their website boasts that their software is used to proctor 50 million exams annually and they work with 2,000 institutions in 50 different countries. But here I am, angry as ever, concerned that educators have gotten carried away with a technology without considering its implications. And, as usual, my gripe is about assessment.

What are we really measuring?

Respondus offers regular training webinars for instructors. Here are the outcomes for an upcoming webinar:

Each training will cover, from the instructor perspective:

  • How to use LockDown Browser to prevent digital cheating in proctored testing environments
  • How to use Respondus Monitor in non-proctored environments, to protect exam integrity and confirm student identity
  • How Respondus Monitor provides greater flexibility for when and where tests are taken
  • Efficient review of the assessment data collected, including student videos
  • Best practices and tips for success with both applications
  • A chance to ask questions

I am particularly confused by the fourth bullet: efficient review of “the assessment data collected, including student videos.” How is the surveillance data collected considered assessment data? Isn’t the assessment data the actual test results (e.g., whether or not students could meet the learning outcomes of the quiz or test)? I suppose if you saw clear evidence of academic dishonesty in the surveillance data (for example, the student had the textbook open on their desk during a “no book” test), then it would invalidate the assessment results, but it would not be the assessment data itself.

Maybe they’re just using “assessment” in an inaccurate way. Maybe it’s not a big deal. But I’m inclined to believe the word “assessment” has a particular meaning about student learning, and most accrediting bodies would agree.

Accreditation and surveillance

Colleges and universities almost never lose accreditation over facilities. You can educate students in a cornfield, in a portable building, in a yurt without running water or electricity—provided you have assessment data that shows that student learning outcomes were met for the program. You can’t award degrees without assessment data. You have to show that your students learned something. Seems reasonable, no?

So here’s my worry. Are we confusing surveillance with assessment data? Do we think that recording students during exams will appease accreditors? “Look, see! They didn’t cheat. They answered all of these test questions, and they got good scores.”

I understand the occasional need for a controlled testing environment, especially in high-stakes exam situations for professional certification (I’m thinking of the NCLEX for nurses, for example). I don’t understand controlled testing for formative assessment, especially for short quizzes in a first-year general education course. Even in a completely online course, I’m not sure I see the value in putting students through surveillance measures for quick knowledge checks of essential facts. When it comes to summative assessment of your course’s essential learning outcomes, couldn’t you meet the learning outcomes some other way that prevented simple cheating? What possibilities might open up if you invited your students to deeply process the material, connect to it in their own way, and show you the meaning they’ve made from it?

I think that there is no greater indication of an instructor’s values than how they spend time in a classroom. If what you truly value is assessing student learning in a tightly-controlled, surveilled environment—why not just take the quiz in a computer lab classroom where you can watch all students at once?

Is surveillance necessary for accreditation of online degrees?

My first answer to this question is, I’m not sure, and I’d like to learn more about this. I know that some fully online programs require students to take exams at proctored testing sites (e.g., by using a campus testing center at a nearby college or university). This practice is held up to accrediting agencies as proof of the program’s commitment to academic honesty. Of course, there is some healthy skepticism about this. In a 2011 article about online exam procedures, researchers suggested that requiring a once-per-semester proctored exam was “a token effort to ensure academic honesty.”

I took a quick glance through the Western Association of Schools and Colleges (WASC) Postsecondary Accreditation Manual and I couldn’t find the word proctor anywhere in the document. Or the word cheat or the phrase academic honesty (the word honesty is used—to describe the governance procedures of the institution). While it is important to demonstrate that student learning outcomes are being met through valid means (e.g., institutions need some reasonable assurance that students are doing their own work), I could not find evidence that this accrediting body specifically requires proof of proctoring or cheating-prevention. Does anyone know if other accrediting standards indicate otherwise?

Sources

Cluskey Jr., G. R., Ehlen, C. R., & Raiborn, M. H. (2011). Thwarting online exam cheating without proctor supervision. Journal of Academic and Business Ethics, 4, 1-7.

Moncrief, D., & Foster, R. (2012). Well that was easy: Misdirecting Respondus Lockdown Browser for fun and profit. Retrieved from http://digitalcommons.cwu.edu/source/2012/oralpresentations/18/

Moncrief, D. (2013). Respondus LockDown Browser revisited: Disclosure. Retrieved from http://digitalcommons.cwu.edu/source/2013/oralpresentations/73/

Postsecondary Accreditation Manual. (2013). Western Association of Schools and Colleges. Retrieved from http://www.acswasc.org

Respondus LockDown Browser. (2017). Retrieved from https://www.respondus.com

Featured image courtesy of Pixabay.com.