Hobsons is committed to conducting original research on questions of student success with our members and partners. Like all researchers, we are especially proud when our work is published in peer-reviewed journals. Most recently, “The Value of Common Definitions in Student Success Research: Setting the Stage for Adoption and Scale” appeared in the Spring/Summer 2017 issue of the Internet Learning Journal from the Policy Studies Organization. In it, Karen Vignare, Karen Swan, and I shared findings and outcomes from the development and use of common data and student success intervention definitions, which have helped validate so much of our ongoing modeling, benchmarking, and A/B testing efforts at Hobsons.

Dr. David Longanecker, former undersecretary of the US Department of Education and president of the Western Interstate Commission for Higher Education (WICHE) – and my former boss – once counseled me that, as a profession, education places significant credence in the opinions of its senior leaders. Why should ed-tech be any different? Yes, it’s important to get recommendations from peers and to listen to internal and external experts. But wouldn’t you also like to have some objectivity in the mix? Especially when your decisions affect the futures of so many young people?

Academic research has always been part of the PAR Framework DNA, and it is increasingly embedded in everything we do. Our work with predictive models is based on hard evidence, using the same valid, rigorous, reliable research methods that any self-respecting academic would expect from a colleague. It’s not always easy to make time for the process of study, analysis, partnership, and publication, but we believe it’s essential to continue this work in the PAR tradition. People rely as much on how you arrive at answers as they do on the answers themselves. In education, NOBODY likes the black box.

Here are some of the other peer-reviewed studies we’ve published this past year:

  • “Scaling Student Success with Predictive Analytics: Reflections after Four Years in the Data Trenches.” Wagner, E., & Longanecker, D. (2016). Change: The Magazine of Higher Learning, 48(1), 52–59.
  • “Retention, Progression, and the Taking of Online Courses.” James, S., Swan, K., & Daston, C. (2015). Online Learning, 20(2).
  • “Online Learning, Student Success and Data Analytics.” Swan, K., James, S., Daston, C., & Wagner, E. (2016). In S. Whalen (Ed.), Proceedings of the 12th National Symposium on Student Retention, Norfolk, Virginia (pp. 318–326). Norman, OK: The University of Oklahoma.
  • “Online Learning Gets a Passing Grade: How Online Course Taking Impacts Retention for University Students.” Swan, K., & James, S. (in press). In S. Whalen (Ed.), Proceedings of the 13th National Symposium on Student Retention, Destin, FL. Norman, OK: The University of Oklahoma.
  • “Are Course Withdrawals a Useful Student Success Strategy?” James, S., & Akos, P. (in press). In S. Whalen (Ed.), Proceedings of the 13th National Symposium on Student Retention, Destin, FL. Norman, OK: The University of Oklahoma.

Are you engaged in student success research? We would love to hear about the work you are doing so we can spread the word, get involved, and connect you with others who share your interests. There is so much good work under way!
