Libraries & learning analytics: A brief history.

A slide deck from EDUCAUSE made the rounds on Twitter last week, with many folks expressing shock about libraries & their involvement (complicity) in learning analytics efforts on higher education campuses. But this isn’t new. Academic librarians have been talking about using library data to prove library value for quite a while. Over the past decade, the conversation has been held hostage by one particular professor who has made proving library value the exclusive focus of her scholarly research agenda.

As the old saying goes, if you’re not pissed off, you haven’t been paying attention.

To me, these are some of the significant milestones in the conversation about libraries and their involvement in learning analytics. Can you think of others?

2010
Megan Oakleaf, LIS professor at Syracuse University, publishes the Value of Academic Libraries Report, which was commissioned by ACRL. The report suggests that libraries should track individual student behavior to demonstrate correlations between library use and institutional outcomes, such as retention.

2011
Value of Academic Libraries committee is formed by ACRL Executive Committee.

2012
ACRL is awarded a $249,330 grant from IMLS to fund Assessment in Action: Academic Libraries and Student Success.

2013 – 2016
ACRL runs three 1-year cohorts of AiA projects. Assessment in Action aims to teach academic librarians how to collaborate with other stakeholders on their campuses to measure the library’s impact on student success. According to the AiA website: “The projects will result in a variety of approaches to assessing library impact on student learning which will be documented and disseminated for use by the wider academic library and higher education communities.”

Spring 2014
Oakleaf teaches IST 600 “Academic Libraries: Value, Impact & ROI” at Syracuse University for the first time.

October 2014
Bell publishes “Keeping Up With… Learning Analytics” on the ALA website.

March 2015
Oakleaf publishes “The Library’s Contribution to Student Learning: Inspirations and Aspirations” in College & Research Libraries.

June 2016
College and Research Libraries News declares learning analytics one of the top trends in academic libraries.

July 2016
Oakleaf publishes “Getting Ready & Getting Started: Academic Librarian Involvement in Institutional Learning Analytics Initiatives” in The Journal of Academic Librarianship.

I present “Can we demonstrate library value without violating user privacy?” at the Colorado Academic Library Association Workshop in Denver.

2017
Oakleaf secures nearly $100,000 in grant funding from IMLS for “Library Integration in Institutional Learning Analytics (LIILA)”. The full proposal can be read here.

January 2017
ACRL Board discusses “patron privacy” and if, as a core value, it conflicts with support of learning analytics. The minutes record: “Confidentiality/Privacy is in ALA’s core values, and the Board agreed that patron privacy does not need to conflict with learning analytics, as student research can still be confidential.”

Also at Midwinter 2017, the ACRL Board approves Institutional Research as an interest group to incorporate interest in learning analytics (but, notably, the Board did not want to name the group the “Learning Analytics” interest group).

Also at Midwinter 2017, ACRL Board formally adopts the Proficiencies for Assessment Librarians and Coordinators which makes frequent reference to using learning analytics.

March 2017
Oakleaf et al. present “Data in the Library is Safe, But That’s Not What Data is Meant For” at ACRL 2017 in Baltimore, Maryland.

April 2017
Kyle M.L. Jones and Dorothea Salo’s article, “Learning Analytics and the Academic Library: Professional Ethics Commitments at a Crossroads”, is available as a preprint from College & Research Libraries.

June 2017
Value of Academic Libraries committee meets at ALA Annual. The minutes reflect that VAL wants to distance itself from learning analytics, now that learning analytics has its own interest group.

September 2017
ACRL publishes Academic Library Impact, which explicitly advocates for working with stakeholders to “statistically analyze and predict student learning and success based on shared analytics”.

October 2017
Karen Nicholson presents her paper, “The ‘Value Agenda’: Negotiating a Path Between Compliance and Critical Practice”, at the Canadian Library Assessment Workshop in Victoria, British Columbia.

November 2017
Oakleaf et al. present “Closing the Data Gap: Integrating Library Data into Institutional Learning Analytics” at EDUCAUSE 2017 in Philadelphia. The presentation seems to advocate feeding individual patron data into campus-wide learning analytics dashboards so that other campus administrators, faculty, and advisors can see student interactions with the library.

Emily Drabinski asks, “How do we change the table?” In her blog post, she wonders how organizing can help librarians build power to make change. “We need to reject learning analytics,” she declares.

Steven Bell writes that analytics are “on the rise” in his “From the Bell Tower” column in Library Journal.


Featured image by Lukas Blazek on Unsplash

ACRLPNW 2017

Did you know the ACRL Oregon/Washington joint conference has been held annually since 1981? A little history lesson from University of Puget Sound Science Liaison Librarian Eli Gandour-Rood, ACRL Washington chapter President:

I am happy to share that some digging into our respective chapter archives revealed that the Oregon ACRL chapter, started in 1975, held its first two day conference at Menucha in 1980, followed by the first joint conference in 1981 with the newly-formed Washington chapter (founded in 1980). All records indicate that the two chapters have been holding joint conferences in alternating years ever since; the first meeting at Pack Forest appears to have occurred in 1983.

(Received via e-mail, 25 October 2017)

My favorites from this year’s 37th (!) #acrlpnw at Pack Forest in Eatonville, WA:

Favorite session: “Contemplative Pedagogy: An Ancient Solution to a Modern Problem” with Heather Newcomer (Olympic College) and Nicole Gustavsen (UW Bothell/Cascadia College). Heather and Nicole reminded me about the importance of breathing. Their session illustrated that a 1-minute breathing exercise at the beginning of an instruction session can help students feel centered and focused. I also loved learning about the Contemplative Practices Tree.

Close second: “Built to Last: Integrating OER into Your Library’s Framework” with Candice Watkins and Jennifer Snoek-Brown (Tacoma Community College). Candice and Jennifer highlighted how much labor goes into OER work, and how the Library can be a role model for other faculty on campus for integrating open practices (right down to adding open licenses to the work that librarians create).

Favorite poster: “Revealing and Concealing Information: Arising Tensions in Using Geoinformation Services for Academic Research” with Leah Airt (Seattle Pacific University). I am really excited about Leah’s research which looks at the practical and ethical implications of using Google Street View in lieu of direct observation in research, especially in the study of gentrification, disaster recovery, and urban planning.

Close second: Penelope Wood presented a poster about team-building across Library departments at UW Bothell/Cascadia College through sharing communal lunches. The unique feature of this program was that folks across departments prepared lunch for each other—rather than each person bringing their own brown bag lunch, one person made lunch for two other coworkers and brought enough to share. Feeding one another brought people closer!

Also really great: “Just in Time Assessment: Flexible peer observation during classroom instruction” by Laura Dimmit, Caitlan Maxwell, and Chelsea Nesvig (UW Bothell/Cascadia College).

Favorite mealtime conversation: Sitting across from Amy Hofer at dinner on Thursday night, I asked her how to respond to librarians whose only OER outreach is pushing resources from the Library’s collections. She shrugged. “It’s not OER,” she said. “But it’s still a good thing.”

Favorite format: The fail talks! These were quick, seven-minute lightning talks about failure. Topics included technological failure in information literacy instruction (made meta by slides not loading during the talk), assessment mishaps, student advisory groups disbanding, and the dangers of trying to get student feedback using rolling white boards.

Favorite panel that I moderated: “Changing Tides: Exploring Current Trends in Information Literacy Programs” with Lizzie Brown (CWU Ellensburg), Ryan Randall (College of Western Idaho), Dani Rowland (UW Bothell/Cascadia College), and Megan Smithling (Cornish College of the Arts). These four folks graciously agreed to discuss the information literacy programs on their campuses, and their answers highlighted the varying approaches to integrating information literacy in different contexts.

You can find more information about the fabulous sessions at the ACRL WA & OR 2017 Joint Conference Program website.

Disclaimer: As of October 2017, I am the new ACRL Washington chapter Web Manager, replacing Nicholas Schiller. These views reflect my own personal opinions and are not intended to represent the ACRL Washington chapter Board in any capacity. I would also like to clarify that I was not involved in the selection of sessions or the planning of the 2017 conference.

Observe, Reflect, Learn: Developing a Peer Teaching Observation Program in Your Library

This post corresponds with my presentation at the Canadian Library Assessment Workshop on Friday, October 27, 2017 in Victoria, British Columbia.

Slides: https://docs.google.com/presentation/d/1Stx0qmaKZRM4SIGqoH9N3dqUMzNMCmHzJfuX8YYhr7M/edit?usp=sharing


Scenario – Leave No Trace

You are the Assessment Librarian at a large university with a team of a dozen instruction librarians. Everyone is excited to embark on a new peer observation program–except Barbara. She’s had problems with the Dean in the past and is convinced that the Dean will use the observation process to terminate her. She agrees to participate in the observation program–as long as there is no record of her observation.

How do you proceed?


Scenario – No News for the Newbie

You are a new instruction librarian at a small college with an established peer observation program. The observation process just consists of a simple checklist that faculty fill out and file with the Library Director. Your observer is Terry, an instruction librarian who has been at the library for 30 years and will retire in the spring. He shows up to your class 10 minutes after it starts, and submits the observation checklist to your Director without letting you see it first.

How do you proceed?


References

Alabi, J. & Weare, W. H., Jr. (2014). Criticism is not a four-letter word: Best practices for constructive feedback in the peer review of teaching. LOEX Conference Proceedings 2012. 141-145.

Bandy, J. (2017). Peer review of teaching. Vanderbilt University. Retrieved from https://cft.vanderbilt.edu/guides-sub-pages/peer-review-of-teaching/

Cosh, J. (1998). Peer observation in higher education: A reflective approach. Innovations in Education and Training International, 35(2), 171-176.

Centre for Teaching Support & Innovation. (2017). Peer observation of teaching: Effective practices. Toronto, ON: Centre for Teaching Support & Innovation, University of Toronto. Retrieved from http://teaching.utoronto.ca/teaching-support/peer-observation-of-teaching/

Classroom/teaching observations. Northern Alberta Institute of Technology. Retrieved from http://www.nait.ca/docs/Resource_Module_for_Observations.pdf

Davis, K. D. (2007). The academic librarian as instructor: A study of teacher anxiety. College & Undergraduate Libraries, 14(2), 77-101.

Elmendorf, D. C., & Song, L. (2015). Developing indicators for a classroom observation tool on pedagogy and technology integration: A Delphi study. Computers in the Schools, 32(1), 1-19.

England, J., Hutchings, P., & McKeachie, W. J. (1996). The professional evaluation of teaching. American Council of Learned Societies. Occasional Paper No. 33. Retrieved from http://archives.acls.org/op/33_Professonal_Evaluation_of_Teaching.htm

Fielden, N. (2010). Follow the rubric road: Assessing the librarian instructor. LOEX Conference Proceedings. Retrieved from http://commons.emich.edu/cgi/viewcontent.cgi?article=1026&context=loexconf2010

Franchini, B. (2014). Maximizing the benefits of peer observation. Rochester Institute of Technology. Retrieved from http://www.rit.edu/academicaffairs/facultydevelopment/sites/rit.edu.academicaffairs.facultydevelopment/files//images/MaximizingBenefitsofPeerObservation.pdf

Goosney, J. L., Smith, B., & Gordon, S. (2014). Reflective peer mentoring: Evolution of a professional development program for academic librarians. Partnership: The Canadian Journal of Library and Information Practice and Research, 9(1), 1-24.

Kilcullen, M. (1998). Teaching librarians to teach: Recommendations on what we need to know. Reference Services Review, 26(2), 7-18.

Qualities of an effective peer classroom observation. (2017). Center for Teaching Excellence of the University of Virginia. Retrieved from http://cte.virginia.edu/qualities-of-an-effective-peer-classroom-observation/

Samson, S., & McCrea, D. E. (2008). Using peer review to foster good teaching. Reference Services Review, 36(1), 61-70.

Saunders, L. (2015). Education for instruction: A review of LIS instruction syllabi. The Reference Librarian, 56(1), 1-21.

Snavely, L., & Dewald, N. (2011). Developing and implementing peer review of academic librarians’ teaching: an overview and case report. The Journal of Academic Librarianship, 37(4), 343-351.

Sproles, C., Johnson, A. M., & Farison, L. (2008). What the teachers are teaching: How MLIS programs are preparing academic librarians for instructional roles. Journal of Education for Library and Information Science, 195-209.

Van Note Chism, N. (2007). Peer review of teaching: A sourcebook. San Francisco: Anker Publishing.

Walter, S. (2006). Instructional improvement: Building capacity for the professional development of librarians as teachers. Reference & User Services Quarterly, 45(3), 213-218.


Have we confused surveillance with assessment of student learning?

Somehow I had been blissfully unaware of Respondus Lockdown Browser until last week, when several students came to the library asking if we had this software available on our computers. If you’re not familiar with this product, Respondus is one of several LMS-integrated cheating-prevention tools. In simple terms, it shuts down a student’s Internet browser while they are taking a test in an online class environment, such as Canvas or Blackboard. One of the students who asked about Respondus said something that raised the hair on the back of my neck.

“I need a webcam,” they said. “I have to take the quiz with my webcam on, and there can’t be any movement in the background.”

What the hell? I thought. What are they talking about?

Recording Students During Online Tests

After doing some digging through an e-mail chain, I found a message from the campus eLearning Administrator with instructions for students taking tests with Respondus.

You will be required to use LockDown Browser with a webcam which will record you while you are taking the three module tests. Your computer must have a functioning webcam and microphone. A broadband connection is also required.

  • You will first need to review and agree to the Terms of Use.
  • The Webcam Check will confirm that your webcam and microphone are working properly. The first time the Webcam Check is performed on a computer, Adobe Flash Player will require you to select Allow and Remember.
  • Next you will be asked to take a picture of yourself.
  • After that, you will be required to show and take a picture of a government issued ID such as a driver’s license with your picture clearly displayed. If you don’t have a driver’s license, you can use your student ID card with your picture clearly displayed.
  • Click “Start Recording” and slowly tilt/pan your webcam so a brief video can be made of the area around your computer. Make sure you show your desk, what’s underneath your desk, and a panorama of the room you are in.  (If the webcam is built into the monitor, move it around as best you can to show the areas described.)

As a librarian who cares deeply about student privacy, all of this makes me want to throw up. If I understand this correctly, students must:

  • Accept Terms of Use (which I couldn’t find on the Respondus website, so I’m not sure what, exactly, students are agreeing to)
  • Take a picture of themselves
  • Share their government-issued ID (which would include their date of birth, address, height, weight, and other personal details)
  • Share whatever is visible around their desk and workspace, which, if they’re at home, could include any number of extremely personal items.

Can we agree that asking a student to show “what’s underneath your desk” is particularly perverse?

But the benefits of this invasive procedure, according to Respondus, are numerous—easy to integrate with existing learning platforms, money saved on printing costs, increased efficiency, superior confidence in the accuracy of test results, and so on.

Beyond privacy, what are some other concerns? After some brief searching, I found a presentation from 2012 in which two researchers at Central Washington University showed that Respondus was incredibly easy to manipulate to steal student data—hopefully this has changed. The following year, one of those researchers, Donald Moncrief, gave a follow-up presentation about the exact methodology they used (which they had withheld the previous year, probably to prevent folks from following their steps).

My outrage is a little delayed. Respondus has been in business for ten years. Their website boasts that their software is used to proctor 50 million exams annually and they work with 2,000 institutions in 50 different countries. But here I am, angry as ever, concerned that educators have gotten carried away with a technology without considering its implications. And, as usual, my gripe is about assessment.

What are we really measuring?

Respondus offers regular training webinars for instructors. Here are the outcomes for an upcoming webinar:

Each training will cover, from the instructor perspective:

  • How to use LockDown Browser to prevent digital cheating in proctored testing environments
  • How to use Respondus Monitor in non-proctored environments, to protect exam integrity and confirm student identity
  • How Respondus Monitor provides greater flexibility for when and where tests are taken
  • Efficient review of the assessment data collected, including student videos
  • Best practices and tips for success with both applications
  • A chance to ask questions

I am particularly confused by the fourth outcome, “review of the assessment data collected, including student videos.” How is the surveillance data collected considered assessment data? Isn’t the assessment data the actual test results (e.g., whether or not students could meet the learning outcomes of the quiz or test)? I suppose if you saw clear evidence of academic dishonesty in the surveillance data (for example, the student had the textbook open on their desk but it was a “no book” test), then it would invalidate the assessment results, but it would not be the assessment data itself.

Maybe they’re just using “assessment” in an inaccurate way. Maybe it’s not a big deal. But I’m inclined to believe the word “assessment” has a particular meaning about student learning, and most accrediting bodies would agree.

Accreditation and surveillance

Colleges and universities almost never lose accreditation over facilities. You can educate students in a cornfield, in a portable building, in a yurt without running water or electricity—provided you have assessment data that shows that student learning outcomes were met for the program. You can’t award degrees without assessment data. You have to show that your students learned something. Seems reasonable, no?

So here’s my worry. Are we confusing surveillance with assessment data? Do we think that recording students during exams will appease accreditors? “Look, see! They didn’t cheat. They answered all of these test questions, and they got good scores.”

I understand the occasional need for a controlled testing environment, especially in high-stakes exam situations for professional certification (I’m thinking of the NCLEX for nurses, for example). I don’t understand controlled testing for formative assessment, especially for short quizzes in a first-year general education course. Even in a completely online course, I’m not sure I see the value in putting students through surveillance measures for quick knowledge checks of essential facts. When it comes to summative assessment of your course’s essential learning outcomes, couldn’t you meet the learning outcomes some other way that prevented simple cheating? What possibilities might open up if you invited your students to deeply process the material, connect to it in their own way, and show you the meaning they’ve made from it?

I think that there is no greater indication of an instructor’s values than how they spend time in a classroom. If what you truly value is assessing student learning in a tightly-controlled, surveilled environment—why not just take the quiz in a computer lab classroom where you can watch all students at once?

Is surveillance necessary for accreditation of online degrees?

My first answer to this question is, I’m not sure, and I’d like to learn more about this. I know that some fully online programs require students to take exams at proctored testing sites (e.g., by using a campus testing center at a nearby college or university). This practice is held up to accrediting agencies as proof of the program’s commitment to academic honesty. Of course, there is some healthy skepticism about this. In a 2011 article about online exam procedures, researchers suggested that requiring a once-per-semester proctored exam was “a token effort to ensure academic honesty.”

I took a quick glance through the Western Association of Schools and Colleges (WASC) Postsecondary Accreditation Manual and I couldn’t find the word proctor anywhere in the document. Or the word cheat or the phrase academic honesty (the word honesty is used—to describe the governance procedures of the institution). While it is important to demonstrate student learning outcomes are being met through valid means (e.g., institutions need some reasonable assurance that students are doing their own work), I could not find evidence that this accrediting body specifically requires proof of proctoring or cheating-prevention. Does anyone know if other accrediting standards indicate otherwise?

Sources

Cluskey Jr., G. R., Ehlen, C. R., & Raiborn, M. H. (2011). Thwarting online exam cheating without proctor supervision. Journal of Academic and Business Ethics, 4, 1-7.

Moncrief, D., & Foster, R. (2012). Well that was easy: Misdirecting Respondus Lockdown Browser for fun and profit. Retrieved from http://digitalcommons.cwu.edu/source/2012/oralpresentations/18/

Moncrief, D. (2013). Respondus LockDown Browser revisited: Disclosure. Retrieved from http://digitalcommons.cwu.edu/source/2013/oralpresentations/73/

Postsecondary Accreditation Manual. (2013). Western Association of Schools and Colleges. Retrieved from http://www.acswasc.org

Respondus Lockdown Browser. (2017). Retrieved from https://www.respondus.com

Featured image courtesy of Pixabay.com.

On social media.

I don’t know why I stay.

Every day, I check Facebook and Twitter. Usually multiple times a day. And every time, I read something heinous that turns my stomach or makes me anxious.

Why do I stay? How do I justify continuing to give my most valuable commodity — my data, my ideas, my words, my photos — to Jack and Zuck, with all the terror they’ve endorsed?

Jack has never protected his users. Ever. He has chosen his business over human lives, every time. He chose data over GamerGate, over Pepe, over our President’s daily threats of nuclear war. Women and people of color are continuously harassed and stalked on his platform. He shrugs. Calls it free speech. Puts it back on the victim. Report. Block. Mute. Maybe it’s all in your head.

And Zuck? He sold ads to Russians, and those ads were shared and liked endlessly by our own family members, you know, the aunt or the grandparent that you hope you won’t have to sit next to at the holidays. You unfollowed their posts on Facebook so you don’t have to see the Breitbart posts they share.

In such a short amount of time, Jack and Zuck have made the Internet a terrifying place for women and for people of color. And yet we stay. Why?

It feels hard to justify.

It feels hard to justify my Gmail account, knowing that Google is reading every word that comes and goes from my inbox, and that Google uses that information to feed an algorithm they keep secret. But I know it’s the same algorithm that told Dylann Roof what he wanted to hear before he murdered Clementa Pinckney, Cynthia Hurd, Depayne Middleton-Doctor, Sharonda Coleman-Singleton, Susie Jackson, Myra Thompson, Tywanza Sanders, Ethel Lance, and Daniel Simmons.

And the sick part is–and this is what really makes me angry–I tell myself the dumbest things. I say things to myself like, “This never would have happened if librarians were in charge.” Because I know it’s bullshit. I hold librarians on some higher pedestal, blindly believing some nonsense about how good we are, how our well-meaning (and generally socialist) ideals would have kept people safe from their worst selves. I know it’s true that libraries are in a continuous budget crisis, but still, I think, we never would have sold ads to the Russians!

These things are also true: Melvil Dewey was a rampant sexist. White librarians kept people with dark skin out of libraries until forced to integrate. The President of the American Library Association was quick to announce her willingness to work with the new administration. Some librarians think we should do outreach to the KKK. Others think we should just “be neutral”, as though neutrality is attainable. We are advised to develop strategies for dealing with “difficult patrons” (librarian-lingo for people living in poverty, or with mental illness, or without housing, or anything else that makes them “difficult”). Until very recently we still organized books about undocumented immigration using the pejorative phrase “illegal aliens” and, when it was finally changed, at least one librarian made it well-known that they couldn’t care less. The whiteness of librarianship, especially in colleges and universities, is oppressive and unyielding. So the idea that librarians could have somehow done better, that Twitter would be a better space if we had designed it, or that Facebook wouldn’t be full of racist memes if we curated the content as well as we curate our collections–well, all of that is a symptom of my own white wishful thinking.

I don’t have children. Yet. But I hope to, someday. And what will I say to them, when they ask me to explain why I stayed on Twitter, why I gave my information so freely to Jack and Zuck? Why I continued to endorse a platform exploited by Russian operatives to disrupt our democracy? Why I kept clicking on Facebook, knowing that Facebook profited from ads that spread lies like a virus?

Will it be enough when I say that my mom liked to see my pictures on Facebook, and I liked to talk to my librarian friends on Twitter?

I can see it now: they will scoff, twist their multi-colored bangs, and sigh, “How could you have been so stupid?”


Photo by Justin Main on Unsplash

What would it look like?

Here’s an exercise.

Try to imagine a library that does not care about its users.

What would it look like?

Let’s say that it’s an academic library on a large, urban campus, that serves tens of thousands of students.

What kind of library would it be if it didn’t care about those students?

It might look like this.

There would be no consequences. It wouldn’t really matter what the library did, or if it did it well. The library would have vague statements about its mission and goals, but there would be no measurable outcomes associated with any of the library’s spaces, services, or collections. This would include the library’s multimillion-dollar budget, which would only have a single budget code, so there would be no way to itemize how the library spends its funds. If you asked where the money comes from and how the budget is determined, someone would laugh and say, “Oh, that number is written down in a drawer somewhere.”

There would be no consequences for leaving obscenely large amounts of money unspent, year after year. Unused budget funds would be put into an ongoing, never-ending renovation that leaves the building in a constant state of uncertainty, chaos, physical disarray, and distracting noise. New spaces would be built without description, purpose, or plans to staff them. The library would celebrate the “substantial completion” of the renovation, complete with a ribbon-cutting and replica cake made of fondant, and then the renovation would continue for another year.

There would be no consequences for employees, whose low performance would never be punished and whose outstanding performance would never be rewarded. Non-tenure track library faculty would be employed continuously without appointment letters or contracts. Salaried employees would come and go as they please, sometimes being late to meetings in the afternoon because they simply hadn’t come to work yet that day. Instruction librarians would be late to classes, leaving students and course faculty waiting. The instruction scheduler would be baffled by Microsoft Outlook and its calendaring system; they would assign classes incorrectly, neglect to send instruction confirmations, and humiliate the teaching team. The scholarly communication librarian would hate Open Access. Public services staff would really prefer to work in the back of the library. Instruction librarians would be afraid of speaking to large groups. Collection development librarians would look at crumbling books and say smugly, “A worn collection is a used collection.” Student workers, without supervision or guidance, would ride skateboards through the staff area.

There would be no consequences for not having a faculty handbook, for not following the established rules of shared governance, and for deliberately violating by-laws. Decisions would be made based on an e-mail someone sent once, or how things were done last year, or something someone overheard in a meeting. Promotions would be given based on individual employees and their needs and desires, rather than the goals of the organization (there are no measurable goals, anyway). Knowledge management would be practically non-existent, with documents scattered between a shared drive, an intranet, and cloud-based software. Policies and procedures would refer to individuals by name, rather than by their position or role.

The university responsible for this library wouldn’t particularly care who was in charge of it, and would leave interim leadership in place for years. Interim reporting lines would cascade as mid-level management left the organization, so employees would be in “continuity of operations” plans indefinitely. The university would open and close a search for a Library Director, declaring none of the candidates “viable” because they do not meet the requirements of the rank of Full Professor. Nevermind, of course, that no one in the library has ever been promoted to Full Professor, and nevermind that only three of the library’s two dozen faculty are tenured or tenure-track. Nevermind that what the library really needs is an effective manager, not a scholar.

If this library didn’t care about students, it might or might not keep any data about how the library is used, and if such data were recorded, it probably wouldn’t be regularly reported or used to make decisions in any way. The library’s operating hours and its services would be available at random, at the whims of the library, whenever it felt like staffing things, whenever employees were available. Onboarding for new hires would be haphazard. There would be no orientations or procedures or checklists or training manuals. There would be no quality checks to see if things were being done well, because who would decide what that looks like?

Who is actually in charge? Look at the staff directory; it says vacant.

If this library didn’t care about students, it wouldn’t keep them safe. Intoxicated people would interrupt instruction sessions and refuse to leave the classroom. People would camp in the building overnight. Security guards would gently nudge sleepers, then let them fall back asleep. It’s understandable, of course, that the library would be a popular place for anyone seeking refuge: the library is the only building on campus where community members cannot be trespassed. Students would leave the library, complaining about these safety issues, and study somewhere else.

If this library didn’t care about students, it would be impossible to retain faculty and staff who do care about them. Those people would get angry and exhausted. New hires would be undermined and sabotaged. Competent employees would be labeled as “over-ambitious.” People would leave this library, choosing lower-paying jobs, longer commutes, positions outside of libraries, expensive cross-country moves, or outright unemployment, simply to get away from the dysfunction.

The turnover rate would be high, but the remaining employees would tell themselves it’s somehow normal. “That person really wanted to get back into public libraries,” they would say. Or, “Their spouse got a new job out-of-state, so they had to go.” Some people would stay just long enough to get a better job title to put on their CV, a reward for putting in their time, and then they would move on, too.

So the leftovers would settle in, determined to outlast all of the perky people with new ideas, and wait. What is there to lose? There are no consequences, anyway.

Featured image by NeONBRAND

On Angie Manfredi’s resignation from the Newbery Committee.

To the ALSC Executive Committee & Directors,

I was extremely disappointed to read Angie Manfredi’s blog post explaining her resignation from the 2018 Newbery Committee.

Some people might say that the details behind Angie’s resignation don’t matter. I believe the details do matter, and they matter a lot. In fact, it’s the details of this story that make my stomach turn.

I understand that Angie was asked to resign because she shared a story about her job as a children’s librarian on Twitter. Specifically, she shared a story about a young reader of color at her library who was excited to read a book that reflected his life and interests. Yes, Angie praised the author and publisher of the book for providing a story that connected with this young reader. This brief anecdote was widely shared as an example of the importance of diversity in children’s literature. And for this attention, for this highlighting of the need for diverse books, you determined that Angie gave the appearance of an inappropriate relationship with the author and publisher.

Shame on you. It can’t be said enough, so I’ll say it again. Shame on you.

Let’s consider all the messages that are sent by Angie’s resignation:

Celebrating diversity in children’s literature is an inappropriate activity for ALSC award committee members.

ALSC award committees are only interested in librarians who can comply with outdated procedures that silence and limit a librarian’s professional contributions.

White supremacy is the highest value in librarianship.

With your decision, you have left no room for otherness. What I mean is, how could you expect a librarian of color to want to participate in an ALSC award committee after this decision? Or a queer librarian? Or a librarian living with a disability, or a mental health issue? If they speak out publicly, in any way, about their work, their patrons, their excitement for diverse representations in children’s literature, they will be asked to resign. Because of the appearance of bias.

I hope it has been made very clear to all of us in 2017 that there is no neutrality in librarianship. I am personally humiliated by the resignation of Angie Manfredi from the Newbery Committee because I feel it cheapens the reputation of librarians everywhere. How can we claim to support our communities when we punish librarians like Angie for doing their job, for celebrating literature, and for acknowledging the work of authors and publishers to make the world a better place?

I share with many others the opinion that Angie is one of the most valuable librarians working today because she actively critiques the field of librarianship. We need more librarians like Angie Manfredi, we need them to serve on more committees, and we need them to provide examples of how to lead.

I look forward to a public response from ALSC that includes a plan to update the policy for service on award committees to avoid situations like this in the future.

Sincerely,

Zoe Fisher
Stonewall Award Committee Member, 2018
MLS, Emporia State University, 2010
BA, Oberlin College, 2008
www.quickaskzoe.com