
Virtual Insights Recap: Takeaways for Higher Education Professionals

Reading time: 5 minutes

By Earl Sires, Rapid Insight

Virtual Insights, our first major online user event, was packed with interesting information for higher education professionals. It featured a total of seven distinct data-focused sessions, ranging from high-level discussions to practical, in-product education.

The entire event is well worth watching, but in case you’re short on time and interested in some of the bigger takeaways from our expert guest presenters, we compiled this blog post to deliver some of the most meaningful insights in a digestible form.

Here is our top takeaway from each of our guest presenters!

“Create a Matrix-Driven Data Culture”

At the University of Southern Maine, Jared Cash and his team implemented a “Matrix Model” to create a centralized team of data experts embedded across the University’s departments. Essentially, a staff member in each major department serves on the data task force, which collaborates across departments on larger data initiatives at the institution.

This allows the data team to develop localized, expert knowledge pertinent to each department’s needs. Rather than making assumptions about data needs, Cash and his team are able to gain direct insight on specific priorities and pain points.

Typically, the embedded data experts in each department work under that department’s director: for example, a Marketing Data Analyst stationed under the Director of Marketing & Brand Management. In some cases, the Director themselves is a member of the data task force, as in the case of the Director of Student Information Systems.

At your University, it may be worth forming a similar matrix-driven model to leverage your existing staff resources to the maximum extent possible. Cash recommends that universities “build localized talent first, then add bandwidth centrally that can sustain and support data intelligence that aligns with mission priorities.” Disconnected analysts or data-inclined employees working independently to serve the goals of their department are valuable recruits for an organization-wide data task force.

Watch Jared Cash’s full presentation here.

“We already have a ton of information about whether or not a student is going to be successful”

During our user discussion panel (“The Big Questions in Higher Education”), Bryan Terry, the Vice Chancellor of Enrollment at Arkansas State University, remarked on the topic of making standardized testing an optional admissions metric.

Terry described the process of determining which factors influence student success as a retrospective analysis. When looking at successful undeclared or first-generation students, which factors consistently appear in their applications, financial aid materials, and letters of recommendation? There are likely commonalities that can be cataloged and used to build a library of significant variables associated with student success.

In this way, it should be possible over time to identify a range of alternative metrics for predicting student success without standardized test scores.
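The retrospective analysis Terry describes can be sketched in a few lines: compare how often each factor appears among students who succeeded versus those who did not, and keep the factors with the highest lift. This is a minimal illustration, not Arkansas State’s actual method, and the record fields and factor names below are hypothetical.

```python
from collections import Counter

# Hypothetical application records; the factor names are illustrative only.
students = [
    {"graduated": True,  "factors": {"campus_visit", "strong_recommendation", "fafsa_on_time"}},
    {"graduated": True,  "factors": {"campus_visit", "fafsa_on_time"}},
    {"graduated": False, "factors": {"strong_recommendation"}},
    {"graduated": False, "factors": set()},
]

def factor_lift(students):
    """For each factor, the rate it appears among graduates divided by the
    rate it appears among non-graduates. Lift well above 1 flags a factor
    worth cataloging as a candidate predictor of student success."""
    grads = [s for s in students if s["graduated"]]
    others = [s for s in students if not s["graduated"]]
    counts_g = Counter(f for s in grads for f in s["factors"])
    counts_o = Counter(f for s in others for f in s["factors"])
    lift = {}
    for f in set(counts_g) | set(counts_o):
        rate_g = counts_g[f] / len(grads)
        rate_o = counts_o[f] / len(others)
        lift[f] = rate_g / rate_o if rate_o else float("inf")
    return lift

lift = factor_lift(students)
```

In practice you would run this over years of records and feed the high-lift factors into a proper model as candidate variables, rather than reading the ratios directly.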

“We’re asking our population to allow us to show other data,” said Terry. “Of our students, ACT is not as strong a predictor as GPA.” A study published in Educational Researcher supports this conclusion, finding that high school GPA is five times more predictive of college graduation than ACT scores.

Watch the full panel discussion here.


“Make data friendly, and consult with the groups that actually use it”

Phebe Soliman, the Dean of Institutional Research in the Office of Institutional Effectiveness at County College of Morris, contributed thoughts during the user discussion panel on establishing institutional support for data initiatives.

Soliman advised that top-down endorsement and support is critical to success, but that implementation is equally important and requires different tactics to establish.

Often, success in implementation comes down to building trust with departments that are not yet bought in to a data initiative. It could be that a particular department has used the same reporting system for decades. Perhaps they’re reluctant to move on to something unfamiliar, or skeptical that it will deliver the same quality of results.

Consultation, conversation, and, above all, persistence are essential. “They have different ideas about what they want; we only try to interpret and understand,” said Soliman. “There’s a lot of collaboration.” Taking the time to understand what a department needs and how your data initiative can serve that need will go a long way toward establishing trust.

Ultimately, establishing good buy-in at the implementation level comes down to helping end users achieve their goals. If you can demonstrate how your data initiative will lead to better analysis, conversations, or decision-making, and how that plays into a department’s existing goals, you’re much more likely to get the ground-level support needed for an initiative to flourish.


“A modeling basis for scenario planning”

Due to the instability COVID-19 introduced into historical data sets, the efficacy of predictive modeling has recently been called into question.

Laura Miller, the Director of Institutional Research at Messiah University, noted during the user discussion panel that while desired outcomes may change, predictive modeling is still a viable and useful tool. However, you may need to reassess your procedures and goals.

Messiah is currently tracking the impact coronavirus has had on the most significant variables in their predictive models. For example, admissions yield models tend to weight in-person events and campus visits heavily, and this year those are limited and largely virtual.

This means the admissions model can’t directly predict admissions this year in the way it did in the past. However, the model can still forecast a range of possible scenarios since it will demonstrate the impact of changes to in-person events and campus visits.
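The scenario-forecasting idea above can be sketched with a toy yield model: hold the model fixed and sweep the in-person inputs across plausible levels to see the range of outcomes. The model form and coefficients below are hypothetical stand-ins; a real yield model would be fit on the institution’s historical data.

```python
# Toy linear yield model: predicted share of admitted students who enroll.
# Coefficients are illustrative, not Messiah University's actual model.
def predicted_yield(visit_rate, event_rate, base=0.15, w_visit=0.30, w_event=0.10):
    """visit_rate / event_rate are the shares of admits who visit campus or
    attend in-person events; the weights capture their historical influence."""
    return base + w_visit * visit_rate + w_event * event_rate

# Scenario sweep: campus visits from fully virtual (0%) up to normal levels (60%),
# with in-person events held at a reduced 20%.
scenarios = {f"{int(v * 100)}% visits": round(predicted_yield(v, 0.2), 3)
             for v in (0.0, 0.2, 0.4, 0.6)}
```

Even though no single scenario is a direct prediction, the spread between the pessimistic and optimistic cases gives planners a concrete range to budget against, which is how the forecasts feed the financial conversations described below.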

These insights guide conversations about how to move forward, with the potential impact on finances in mind.

Additionally, incorporating alternative metrics like open and click rates for electronic communications will take on increased importance, and can potentially be used as a basis upon which to build more accurate models in the future.

Another tip Miller offered is that crises like COVID-19 sometimes open doors that were previously closed. If there is data your department needs for more robust analyses, there may be greater willingness to grant access when the need is clear and present. In the short term, the model benefits from richer data, and the success of the model could justify ongoing access once the crisis subsides.


Until Next Time…

While this year’s user event was quite different in its format, the sessions were nonetheless a valuable look into the minds of higher education professionals who work with data.

In addition to the sessions featuring guest speakers and panelists, Virtual Insights featured two Deep Dive webinars. Each session was an in-depth tutorial for users of our data software products: Construct (for data preparation) and Predict (for predictive modeling). 

We hope you’ll join us at next year’s event, whether it be our traditional in-person conference or Virtual Insights 2.0!
