Learning Analytics

At this session, Provost Tristan Denley of Austin Peay State University will discuss "Degree Compass," which uses predictive analytics techniques, based on grades and enrollment data, to make individualized course recommendations for students.
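
As a rough illustration of the kind of recommendation the talk describes, the sketch below predicts a student's grade in untaken courses from historical grade data and recommends the highest-predicted ones. This is a minimal baseline-predictor sketch in Python; the course names, data, and blending rule are invented and do not reflect Degree Compass's actual model.

```python
# Minimal sketch (not Degree Compass's actual algorithm): predict a student's
# grade in each candidate course as the course's historical average adjusted
# by how far the student's own grades sit above or below course averages,
# then recommend the courses with the highest predicted grades.
from statistics import mean

# Hypothetical grade history: {student: {course: grade on a 4.0 scale}}
history = {
    "s1": {"MATH 115": 3.7, "PHYS 140": 3.3, "ENG 125": 3.9},
    "s2": {"MATH 115": 2.9, "PHYS 140": 2.5, "STATS 250": 3.1},
    "s3": {"MATH 115": 3.4, "STATS 250": 3.6, "ENG 125": 3.5},
}

def course_mean(course):
    """Average grade earned in a course; overall average if no one has taken it."""
    grades = [rec[course] for rec in history.values() if course in rec]
    if not grades:
        grades = [g for rec in history.values() for g in rec.values()]
    return mean(grades)

def student_offset(student):
    """How far the student's grades sit above or below the course averages."""
    return mean(g - course_mean(c) for c, g in history[student].items())

def predict_grade(student, course):
    """Baseline prediction: course average plus the student's offset."""
    return course_mean(course) + student_offset(student)

def recommend(student, candidates, k=2):
    """Rank courses the student has not yet taken by predicted grade."""
    untaken = [c for c in candidates if c not in history[student]]
    return sorted(untaken, key=lambda c: predict_grade(student, c), reverse=True)[:k]

print(recommend("s1", ["STATS 250", "MATH 116", "PHYS 140"]))
```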

Lunch will be provided. 

Sponsored by the Provost's Task Force on Learning Analytics, the Student Learning and Analytics at Michigan (SLAM) Seminar is a year-long speaker series. More information about learning analytics at U-M, along with videos and slides from the 2012-2013 SLAM series, is available online.

Information about the 2011-12 Symposium on Learning Analytics at Michigan series is also available online.

Event Information
Start Date: Fri, 01/18/2013 - 12:00pm
End Date: Fri, 01/18/2013 - 1:30pm
Presenter(s): Dr. Tristan Denley, Provost, Austin Peay State University
Eligible for Certificate: Not eligible for Certificate

The links in this section provide guidance about data sources available for measuring student learning, whether for assessment or research.


Key Definitions & Frameworks

Data sources that are useful to consider in assessing student learning are:

  1. Evidence of learning outcomes

    Direct measures of learning


How can we use available data about students to fine-tune our instruction and facilitate their learning? Thanks to the Learning Analytics Task Force and the SLAM lecture series, this question is getting a lot of attention on campus this year. Some especially innovative answers come from 2012 TIP winners Tim McKay, David Gerdes, and August Evrard, whose "Better-Than-Expected" (BTE) project used analysis of large data sets to support student learning in introductory physics courses.

The three Arthur F. Thurnau professors analyzed data from 48,579 U-M intro physics students over 14 years to generate models for predicting student success in these gateway courses. By correlating data on students' preparation (standardized test scores, prior U-M GPA, previous coursework), background (gender, socioeconomic status), and progress through the courses (homework grades, exam scores, class participation), the BTE team discovered that prior academic performance was a significant indicator of success in the introductory courses. In effect, students' progress through the semester was largely determined by their starting point. The team realized that, in order to develop the learning potential of all students, they needed to move away from a "one-size-fits-all" instructional model.
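
The article does not spell out the team's statistical methods, but as a hedged illustration of the kind of analysis described above, the sketch below fits a logistic regression on synthetic data to estimate each student's probability of success from preparation- and progress-style features. The feature names, data, and outcome definition are invented for illustration only.

```python
# Illustrative only: a logistic-regression sketch of predicting success in an
# intro course from preparation and progress features like those listed above.
# The data, feature names, and outcome are synthetic, not the BTE team's model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000

# Hypothetical predictors: standardized test score, prior GPA, homework average.
X = np.column_stack([
    rng.normal(600, 80, n),    # standardized test score
    rng.normal(3.2, 0.4, n),   # prior GPA
    rng.normal(80, 10, n),     # homework average so far (%)
])

# Synthetic outcome: 1 if the student earns a B or better, generated here from
# a noisy linear combination of the predictors.
signal = 0.01 * (X[:, 0] - 600) + 2.0 * (X[:, 1] - 3.2) + 0.05 * (X[:, 2] - 80)
y = (signal + rng.normal(0, 1, n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X_train, y_train)

print("held-out accuracy:", model.score(X_test, y_test))
print("P(success) for one held-out student:", model.predict_proba(X_test[:1])[0, 1])
```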

Enter E2Coach. With support from the Gates Foundation, the group built an Electronic Expert Coaching system, which they launched across all intro physics courses in January 2012.
