I want to talk about a strange irony that I’ve been thinking about lately. It has to do with the ways in which information technologies — and all the wonderful things they afford us — may have actually, somewhat indirectly, encouraged universities and colleges into a “zone of safety” that, ultimately, may be spelling the demise of these institutions.
Bear with me. I know that sounds a little dire. I’ve been told I have a flair for the dramatic, and I’m done fighting it. 🙂
What I’m interested in, on a very basic level, is data. I would suggest that it was about 20 years ago that University administrations really began to realize that technology could help them be more efficient and responsive to their student clients, and, as a result, they began to invest in information systems, first mainframe-based and more recently of the data center variety. With those systems, suddenly the University had tools at its disposal to start collecting all kinds of information about all kinds of things: students, faculty, classes and enrollment, scheduling, institutional projects; the list goes on and on.
I’ve noticed a funny thing happens when people realize they can gather data; they automatically assume they should. And when institutions are the ones doing the data collection, they automatically assume they should use that data to become better businesses. Even if being a “business” isn’t their core mission.
I look around at some of the things that I think plague higher education:
* a greater emphasis on careerism rather than education
* valuing courses over people (and the connections between them)
* creating curriculum based on maintaining course enrollments rather than building a culture of learning
* marketing universities with information about how long it takes to graduate, how many graduates land jobs, and the latest ranking in US News and World Report rather than finding ways to expose the life of teaching, learning, and research at the University and letting that speak for itself
I start to wonder how many of these decisions have been enabled by some well-meaning administrator’s analysis of data using the latest, greatest tool for data analysis.
Now, I’m not arguing that collecting data is bad. I’m definitely not saying that. I’m also not arguing that analyzing data is bad.
Rather, I’m concerned that our analysis of all of that data and information isn’t happening in the context of an ongoing, rigorous, creative conversation about the mission of higher education. We should collect the data, we should use the data, but sometimes we should be brave enough to say “Data be damned! Let’s do the right thing.”
A few weeks ago I was listening to an episode of the podcast “Ockham’s Razor” in which Australian scientist John Bradshaw discussed his experience of getting a PhD at Cambridge 40 years ago. He described how he was able to rig up a lab to do detailed analysis of photographs he was taking of subjects’ eyes (he was analyzing their irises) in the basement of a building that his department had just acquired. He didn’t ask for permission. His advisor never even knew about his arrangement until he turned in his thesis two years later:
That’s how things often happened in Britain in those days, laissez-faire, sink-or-swim, all very different from the carefully civilised apprenticeship closely integrated into the lab’s overall strategic plans, of the modern science PhD, often with a committee of supervisors closely following, and often squabbling over a student’s progress. Nowadays, graduate students are more of a work-horse whose success is hardly less important to their supervisors’ careers than it is to their own. However certain personalities, such as my own, take well to being left free to explore the world of science in their own way and in their own time. That cellar was just great!
It may be a stretch, but I think this anecdote is related to my sense that we’re allowing ourselves to over-engineer the experience of getting an education — and often we’re doing it on the back of the data that we’re collecting and carefully analyzing.
I worry that we’re so busy making sure we’re doing what’s strategically right according to that data that we’re forgetting about the role that play, serendipity, imagination, risk, and even failure can and should play in education.
And for me, as someone who works with technology and promotes the transformative effect it can have on teaching, learning, and research, it intrigues me that the flip side may be this: technology’s integration into the University may have led us down this path.
Can’t we see this as another of a series of poor attempts made to deal with the information overload that all of us have been facing? Both faculty/administrators and prospective/current students/parents have to figure out some way of addressing the role of the increasingly expensive collegiate experience. Colleges have to justify their prohibitive expense and parents (and increasingly students) want that justification spelled out for them (and want a measurable return on their investment). The vast amount of data available today about schools and the college experience means that parents and students are easily overwhelmed in their choices. A ranking system allows those parents and students to cope with that overwhelming set of data, providing a set of “concrete” justifications to hang their decisions on. Rankings systems (based on that data) also allow colleges to address (at least in appearance) questions of fiscal accountability (without really exploring substantive external or internal questions about the links between “value” and “education”). It’s not a perfect system, but the structure that data built does allow a kind of compromise method for all these actors to discuss higher education in a manageable way.
But ultimately this system is far from perfect and reveals a substantive failure of academia to properly identify and explain its role. The argument we should be loudly and broadly and proudly making is that the educational experience you (and Gardner and Steve and so many others) are writing about (learning focused; interdisciplinary in all the best ways; playful; collaborative and individualized; potentially, though not necessarily, technology-enabled) is worth the money spent because it _does_ make graduates better able to succeed in the work force, as well as making them better citizens, better friends, better voters, better people….
The data-driven approach to education (epitomized by the US News and World Report Rankings, but perpetuated by many others) appeals to people (and always will–it’s easier and it’s minimally satisfying). Of course, if we consider quantitative literacy as important as written, aural, and visual literacy–and what good liberal arts program wouldn’t?–then we could teach students (and their parents) as well as our fellow academicians how to look behind those stats to see the assumptions behind them. And let’s turn all that data (and the tools for presenting it) in our favor. Admittedly, many of the benefits we’re talking about are not easily quantifiable. But that doesn’t mean that we can’t quantify them or present them in new ways. That’s why it’s so important that we continue to develop the ways to make individual and community educational experiences visible to ourselves and to others that we’ve been working on at UMW.
[I’ll also post a revised version of this at http://mcclurken.blogspot.com/2007/07/data-information-overload-and-selling.html]
Really thoughtful post. My one small push back is that I think we need to look at these trends in the broader societal context. Most of the disturbing trends you identify are happening everywhere. Life is increasingly commodified, which means material considerations trump all those soft human values. Demanding that the information we use to inform our decisions be quantifiable is a means of ensuring that material values win out.
Jeff — as usual, thanks for taking a rough idea I threw out there and deepening the analysis! You’re right — it IS a kind of information literacy. And the challenge of “making visible” the work we do is, in my mind, paramount to fighting the urge to simply quantify, quantify, quantify. If we can expose the ways in which students are deeply engaging with themselves, each other, faculty, and the world beyond through study and research, we can turn this conversation on its head, I think. It’s like that vision you keep having of showing incoming students/parents a tag cloud of the University over the last 24 hours. Let’s not just show them US News statistics — let’s also show them our minds, our successes and our failures, our deep connections with one another.
Brian — You’ll get no disagreement from me that this trend is frighteningly widespread. As the parent of a young child, I regularly worry about how to fend off the commodification beast. That said, I think it’s easier for me to understand how business and commercial interests fall prey to these tendencies. When your mission has always been the bottom dollar, more data to help you get more money seems like a no-brainer. Mind you, I still wish we saw more companies out there committed to doing good, not just doing well by their pocketbooks. Higher education, on the other hand, should answer to a higher power, I think. It’s disgraceful that we’ve allowed ourselves to let the bottom dollar become our new driving force. I say that knowing full well that, yes, there are institutional and economic realities to be faced. I’m not trying to be naive or to gloss over the tough decisions administrators face in resource-lean times. I think we need to find a way to set the example, defying those cultural tendencies, and holding up for everyone a vision of what another (better, more ethical, more *human*) set of values looks like. Otherwise, let’s just go home.
This is a really important and timely conversation. Did anyone catch the June NY Times report re: an organized wave of liberal arts colleges dropping out of participation in the US News and World Report survey? http://tinyurl.com/3c589y
Another comment on issues of measurement came recently from the President of Princeton University, Shirley Tilghman, in her address at last May’s commencement. Here she focused her entire discussion upon the question of how we should assess the quality of an education offered by colleges. Tilghman aimed a direct critique at the current push by the Dept. of Education to impose standardized testing at the university level. She also broadened the discussion to the public concern (esp. that of parents) for measuring university performance – a growing concern in a day of growing collegiate debt and a free market culture. Fittingly for a Princeton President, she drew on Woodrow Wilson to articulate the ambitions faculty hold for the education they impart. For the complete text of the speech, an articulate argument, see: http://tinyurl.com/2frxfb
Educators who share the same vision of the project as being not so much about “learning in itself as in [imparting] the spirit of learning” (to borrow from Tilghman’s own extended quote of W. Wilson) are working to bring this discussion to a broader public forum. It’s good to see this happening locally as well– Martha, your ambition for setting an example at UMW is one that, as we saw last night, has real momentum. I’m looking forward to continuing the conversation…
A really thoughtful thread presented here. And I composed an extended reply yesterday (which was promptly eaten by a fitful computer, unfortunately). Two items I was pointing to, though, were the recent decision by a large group of liberal arts colleges to withdraw from the US News and World Report rankings (see: http://tinyurl.com/3c589y , if the NY Times site will cooperate) and a recent commencement speech by Princeton University President Shirley Tilghman.
The latter item directly addressed the question of concern – parental concern in particular – for measuring the quality of a university education (with subtle ref. to the current culture of a free market system). Her speech was also a direct critique of the US Dept. of Education’s push for standardized testing at the university level. Great stuff. http://tinyurl.com/2frxfb
It’s v. good to see that there is a broader move to articulate the value of a lib arts ideal, and to counter the constant pressure to quantify, measure, rate… The trick is to bring an alternate presentation of the quality of an education. Fortunately, I think we’ve got a real momentum rolling here at UMW in this direction. (This was esp. apparent at Gardner’s event two nights ago!) I’m looking forward to continuing the conversation…
Urk! The thread finally got itself posted, twice, after I thought my computer had swallowed an early version. Apologies for vague repetition…
I’d actually separate out the professionalization of the university from the overengineering of process and metrics.
Why? Because my experience is that private industry is considerably more fleet of foot than the University. I’ve been at a number of companies in my career, and the problem with the University (IMHO) is that we romanticize process over results, and we do so because there is no effort to engage with the outside world — so we get these weird feedback loops where all that is valued is the process.
(and that, oddly enough, is the same critique that is often made of the modern IT department).
Sorry — I’m not being terribly clear here — but you are very right about the data becoming unglued from the discussion of mission. I’m just wondering if there isn’t something that has ALWAYS been off in higher education — the sort of thing you see parodied in Amis’s Lucky Jim, etc. back in the 50s.
You’ve been tagged with “Eight Things,” a meme I have finally gotten around to fulfilling. Here’s the post.