What it does
The AI processes the data already sitting in your system and turns it into signals your faculty can act on. No data warehouse project. No six-month implementation.
At-Risk Identification
The AI scores every enrolled student on a risk scale each week, based on attendance gaps, assignment submission patterns, grade trajectories, and LMS login frequency. When a student crosses the threshold you set, the instructor and advisor both get an alert.
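One way to picture that weekly scoring pass is a weighted sum over the four signal families named above. Everything in this sketch, including the signal names, weights, and threshold, is illustrative, not the product's actual model:

```python
# Illustrative weekly risk score: a weighted combination of the four
# signal families the engine tracks. Weights and threshold are made up.
SIGNAL_WEIGHTS = {
    "attendance_gaps": 0.35,    # fraction of sessions missed
    "late_submissions": 0.25,   # fraction of assignments late or missing
    "grade_decline": 0.25,      # normalized downward grade slope
    "login_inactivity": 0.15,   # days since last LMS login, normalized
}

def risk_score(signals: dict) -> float:
    """Combine normalized signals (each clamped to 0..1) into a 0..1 score."""
    return sum(SIGNAL_WEIGHTS[name] * min(max(signals.get(name, 0.0), 0.0), 1.0)
               for name in SIGNAL_WEIGHTS)

def should_alert(signals: dict, threshold: float = 0.6) -> bool:
    """True when the score crosses the institution-defined threshold,
    triggering the alert to instructor and advisor."""
    return risk_score(signals) >= threshold

student = {"attendance_gaps": 0.75, "late_submissions": 0.5,
           "grade_decline": 0.6, "login_inactivity": 1.0}
```

With these made-up weights, the sample student scores 0.6875 and crosses the default 0.6 threshold.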
Outcome Prediction
A predictive model estimates the probability that each student completes the course with a passing grade. Faculty see these probabilities updated weekly and can filter by likelihood range to prioritize outreach to students in the 40–65% zone.
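The likelihood-range filter amounts to a band query over the predicted probabilities. A minimal sketch, using the 40–65% moderate band from the sample report below as the default range (the helper name and data shape are assumptions):

```python
# Hypothetical helper: pick out students whose predicted completion
# probability falls inside the band faculty selected on the dashboard.
def in_probability_band(students, low=0.40, high=0.65):
    """students: list of (student_id, completion_probability) pairs."""
    return [sid for sid, p in students if low <= p <= high]

roster = [("s1", 0.92), ("s2", 0.55), ("s3", 0.31), ("s4", 0.48)]
# in_probability_band(roster) -> ["s2", "s4"]
```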
Engagement Patterns
Heatmaps show when students are active, which content they spend time on, and which resources they skip entirely. Faculty can see that 70% of the class watched the Week 4 lecture but only 23% opened the supplemental reading. That is actionable.
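The percentages behind those heatmap cells are simple engagement rates: students who touched a resource divided by enrollment. A sketch with invented sample data:

```python
# Sketch of the per-resource engagement rate behind the heatmap numbers
# (e.g. "70% watched the lecture"). Sample rosters are invented.
def engagement_rate(opened_by: set, enrolled: int) -> int:
    """Percentage of enrolled students who touched a resource."""
    return round(100 * len(opened_by) / enrolled)

enrolled = 40
week4_lecture = {f"s{i}" for i in range(28)}   # 28 of 40 students opened it
supplemental = {f"s{i}" for i in range(10)}    # 10 of 40 students opened it
# engagement_rate(week4_lecture, enrolled) -> 70
```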
Early Alert System
Automated alerts go to the right people at the right time. Instructors get notified about their students. Advisors get notified about their caseload. Department chairs see aggregate risk trends. Each role gets the signal without the noise.
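The role-based fan-out described above can be pictured as a small routing layer: individual alerts go to the instructor and advisor, while chairs see only aggregates. Field names and payload shapes here are assumptions for the sketch:

```python
# Illustrative alert routing: instructors and advisors get individual
# alerts; department chairs get counts, not names.
def route_alert(alert: dict) -> list:
    """Return the recipients for one at-risk alert."""
    recipients = [alert["instructor"]]
    if alert.get("advisor"):
        recipients.append(alert["advisor"])
    return recipients

def chair_digest(alerts: list) -> dict:
    """Aggregate risk counts per department for the chair's view."""
    digest = {}
    for a in alerts:
        digest[a["department"]] = digest.get(a["department"], 0) + 1
    return digest

alerts = [
    {"instructor": "chen", "advisor": "ruiz", "department": "BIO"},
    {"instructor": "chen", "advisor": None, "department": "BIO"},
]
```

The chair never receives student-level detail, only the trend, which is the "signal without the noise" split.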
How it works
Three steps between turning this on and seeing your first risk report.
Connect your data
Grades, attendance, and LMS activity are already in OpenEduCat. The analytics engine reads from those same tables. There is no data export step, no CSV upload, and no third-party connector to configure. If the data is in the system, the AI can see it.
Set your thresholds
Define what "at-risk" means for your institution. Maybe it is two missed classes plus a declining grade trend. Maybe it is no LMS activity for 10 days. Each department can set its own thresholds. The default settings work for most institutions on day one.
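The two example definitions above translate directly into a per-department rule set. A minimal sketch, with invented field names:

```python
# Hypothetical threshold config mirroring the two examples in the text.
# Field names are invented for illustration.
DEFAULT_THRESHOLDS = {
    "max_missed_classes": 2,
    "grade_trend": "declining",
    "max_inactive_days": 10,
}

def is_at_risk(student: dict, t: dict = DEFAULT_THRESHOLDS) -> bool:
    """Flag when either rule fires: missed classes plus a declining
    grade trend, or no LMS activity for too many days."""
    attendance_rule = (student["missed_classes"] >= t["max_missed_classes"]
                       and student["grade_trend"] == t["grade_trend"])
    inactivity_rule = student["inactive_days"] >= t["max_inactive_days"]
    return attendance_rule or inactivity_rule
```

A department could override just one field, say `max_inactive_days`, and inherit the rest of the defaults.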
Review the dashboard
Faculty see a weekly risk summary for each class. Red flags for students who need immediate outreach. Yellow flags for students trending downward. Green for students on track. Click any student to see the full trajectory and the specific signals driving the score.
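The red/yellow/green flags map onto completion-probability bands; this sketch uses the bands from the sample report (below 40%, 40–65%, above 65%):

```python
# Mapping from predicted completion probability to the dashboard flag,
# using the bands shown in the sample weekly risk summary.
def flag(completion_probability: float) -> str:
    if completion_probability < 0.40:
        return "red"      # immediate outreach
    if completion_probability <= 0.65:
        return "yellow"   # trending downward
    return "green"        # on track
```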
What the dashboard actually looks like
Here is a sample risk report for a single section of Introduction to Biology (BIO 101), as it would appear in week five to Dr. Marcus Chen, who teaches four sections of 40 students each.
BIO 101 — Section 3 — Fall 2025
Weekly Risk Summary — Week 5
High Risk: 4 students (completion probability below 40%)
Moderate Risk: 7 students (completion probability 40–65%)
On Track: 29 students (completion probability above 65%)
Top Signals This Week:
- Student #1847: 3 consecutive missed classes, Lab 2 not submitted, no LMS login in 8 days
- Student #2201: Quiz scores declining (92 → 78 → 61), assignment submissions 24+ hours late
- Student #1593: Attendance steady but quiz scores below section average by 1.4 standard deviations
AI Recommendation: Send check-in emails to the 4 high-risk students this week. For Student #1847, consider a meeting with their academic advisor. The pattern matches students who withdrew mid-semester in 3 of the last 4 terms.
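A signal like Student #1593's, "below section average by 1.4 standard deviations," is a plain z-score over the section's quiz scores. A pure-stdlib sketch with invented numbers:

```python
# How a "below section average by N standard deviations" signal can be
# computed. Sample scores are invented; not the report's actual data.
from statistics import mean, pstdev

def deviation_from_section(student_avg: float, section_scores: list) -> float:
    """Negative result = below the section mean, in standard deviations."""
    mu = mean(section_scores)
    sigma = pstdev(section_scores)
    return (student_avg - mu) / sigma

section = [70, 80, 90]   # section mean 80, population stdev ~8.16
```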
Bring Your Own Model
The analytics engine uses the AI model you choose. Paste an API key from OpenAI, Anthropic, or Google Gemini, or connect a locally hosted model running on your own infrastructure. Student data flows directly from your OpenEduCat instance to the model provider you selected. OpenEduCat never stores prompts, completions, or analytics results on our servers.
Districts with strict data residency requirements can run a local model behind their firewall. The analytics results never leave your network. Your IT team controls the model, the data flow, and the usage limits.
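Conceptually, the bring-your-own-model setup is a provider registry: the public entries point at each provider's documented API host, and the local entry points inside your firewall. The `local` URL and function names are placeholders, not product API:

```python
# Sketch of a provider registry for the bring-your-own-model setup.
# Public base URLs are the providers' documented API hosts; "local"
# is a placeholder for a model behind your own firewall.
PROVIDERS = {
    "openai": "https://api.openai.com",
    "anthropic": "https://api.anthropic.com",
    "gemini": "https://generativelanguage.googleapis.com",
    "local": "http://llm.internal.example:8080",   # your infrastructure
}

def endpoint_for(provider: str) -> str:
    if provider not in PROVIDERS:
        raise ValueError(f"unknown provider: {provider}")
    return PROVIDERS[provider]

def stays_on_network(provider: str) -> bool:
    """Data-residency check: only the local model keeps traffic inside."""
    return provider == "local"
```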
Frequently Asked Questions
Common questions about AI learning analytics in OpenEduCat.
More questions? Explore all 91 AI tools or see the LMS module that feeds data into these analytics.
Ready to Transform Your Learning Analytics?
See how OpenEduCat frees up time so every student gets the attention they deserve.
Try it free for 15 days. No credit card required.