On a quiet beach on the island of Oahu stands a historical marker noting the location of the RADAR site at Opana Point. Early on the morning of December 7, 1941, soldiers stationed at this site reported that their brand new RADAR technology indicated planes approaching the island. They were told that what the tool was showing was *probably* planes expected from California, and not to worry about it. Shortly thereafter, bombs fell on Hickam Field and Pearl Harbor, igniting U.S. involvement in World War II.
What does this have to do with student success and institutional research (IR), you ask?
It’s not a newsflash that institutional leadership continues to explore the potential of “big data” and analytics to build capacity, support student success initiatives, and inform decision-making.
In support of this endeavor, IR professionals are being asked to step outside “pure research” and engage with techniques and methodologies more typically associated with business intelligence and insight. In many cases, it’s assumed that these trusted IR professionals will step up as “data coaches” or will naturally provide data visualization support to a variety of new stakeholders.
These aren’t easy adaptations for many IR teams to make.
As a result, compelling student success findings are routinely questioned by those who are wedded to null-hypothesis research methods, or who insist upon applying inferential techniques to large-N results where patterns alone can reveal meaningful insights. Many IR veterans still exhibit disdain for the so-called “data mining techniques” associated with pattern analysis and business intelligence, techniques long accepted in other sectors.
In many ways, I fear that data analytics for higher education are like RADAR: worthy of investment, but not yet accepted as true evidence.
The marker at Opana Point states, "The failure to warn the Army or Navy command in Hawaii was not a failure of the technology as much as it was a failure of organization. While the technology of RADAR functioned as intended and detected the incoming planes, there was no way to accurately assess the information and communicate this knowledge to those in command."
I can’t help but wonder if the transformative evidence coming out of higher education analytics will be disregarded simply because the technology and techniques are too new to be taken seriously. And perhaps this, too, is a failure of organization.
The Association for Institutional Research (AIR) recently published a *Statement of Aspirational Practice for Institutional Research*, which indicates the need to reshape institutional research functions to be more proactive, more consultative, and more focused on solving institutional problems with evidence.
It is my hope that the statement from AIR and other experts will help the profession of institutional research move toward trusting and acting upon the data as it is revealed. Dismissing the facts and patterns in an institution’s student data won’t lead to war, of course. But there may very well be casualties in retention and completion that our country can’t afford.
Swing, R. L., & Ross, L. E. (2016). *Statement of Aspirational Practice for Institutional Research*. Tallahassee, FL: Association for Institutional Research. Retrieved December 12, 2017, from http://www.airweb.org/aspirationalstatement