📊

SEAtS Engage

Engagement scoring, data sources, dashboard interpretation, and automated workflows to identify and support at-risk students.
Student Support · 3 levels · 12 lessons · Self-paced

Foundation — SEAtS Engage

For new users — essentials to get up and running confidently.

4 lessons · Foundation
1 What the engagement score means

The SEAtS Engage score (0–1,000, average = 500) is a relative, peer-referenced measure of how engaged a student is compared to their cohort. It is not an absolute value — a score of 500 always means "average for this group", regardless of the absolute engagement level.

  • 500 is always the peer-group average; above means more engaged than peers, below means less
  • A score below 350 warrants attention; below 300 is a significant concern typically triggering a workflow action
  • Scores update weekly as new data from all connected sources is processed
  • Scores are programme-relative by default — a Business student is compared to other Business students
  • Students cannot see other students' scores — only their own history and standing
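
SEAtS does not publish the scoring formula here, but the relative, peer-referenced behaviour described above can be sketched with a simple z-score rescaling around a 500 centre. Everything in this sketch (the function, the 100-point spread per standard deviation) is an illustrative assumption, not the product's actual model:

```python
from statistics import mean, stdev

def peer_referenced_scores(raw, centre=500, spread=100, lo=0, hi=1000):
    # Illustrative z-score rescaling: the cohort mean maps to `centre` (500),
    # one standard deviation to `spread` points, clipped to the 0-1000 range.
    # NOT the SEAtS formula, just one way to build a relative score with
    # the properties described above.
    mu, sigma = mean(raw), stdev(raw)
    scaled = (centre + spread * (x - mu) / sigma for x in raw)
    return [max(lo, min(hi, round(s))) for s in scaled]

# Hypothetical raw engagement units for a seven-student cohort:
cohort = [12, 30, 45, 45, 60, 75, 88]
scores = peer_referenced_scores(cohort)
```

Whatever the raw units are, the output mean sits at roughly 500, which is why 500 always means "average for this group".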

Tip: When discussing a score with a student, describe the trend rather than citing the number. "Your engagement has been lower than your peers for three weeks" is more constructive than quoting "287".

📝 Quick Check · Answer all 3 to earn your badge

Q1. Which of the following statements about what the engagement score means is correct?

Q2. Which of the following is accurate according to this lesson?

Q3. What is the recommended best practice highlighted in this lesson?

2 Reading the Engage dashboard

The Engage dashboard gives a bird's-eye view of cohort engagement health. The distribution chart shows how students spread across score bands; the declining filter surfaces those trending downward; drill-down lists move you from overview to individual action in one click.

  • The score distribution chart shows student counts across 200-point bands — a healthy cohort has a roughly normal distribution around 500
  • Filter by faculty, programme, year of study, or custom cohort using the filter bar
  • Click any bar in the distribution to see the students in that score range
  • The "Declining" filter surfaces students whose scores dropped more than 100 points in the last four weeks
  • The "Consistently low" filter shows students below 350 for three or more consecutive weeks
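
The two saved filters above reduce to simple checks over a student's weekly score history. A minimal sketch, with list-based histories and function names assumed for illustration:

```python
def is_declining(weekly_scores, window=4, drop=100):
    # "Declining" filter: score fell more than `drop` points across the
    # last `window` weeks (oldest week of the window vs. the most recent).
    recent = weekly_scores[-window:]
    return len(recent) == window and recent[0] - recent[-1] > drop

def is_consistently_low(weekly_scores, threshold=350, weeks=3):
    # "Consistently low" filter: below `threshold` for `weeks` straight weeks.
    recent = weekly_scores[-weeks:]
    return len(recent) == weeks and all(s < threshold for s in recent)
```

For example, `is_declining([520, 510, 480, 410, 395])` is true: a 115-point fall over the last four weeks.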

Tip: Run the Declining filter every Monday alongside your attendance review. A declining score with still-acceptable attendance is a very early warning — catching these students before they miss sessions is the highest-value intervention available.

📝 Quick Check · Answer all 3 to earn your badge

Q1. Which of the following statements about reading the Engage dashboard is correct?

Q2. Which of the following is accurate according to this lesson?

Q3. What is the recommended best practice highlighted in this lesson?

3 Identifying students who need support

The Engage score doesn't replace professional judgement — it surfaces patterns that might go unnoticed in a large cohort, helping staff prioritise who to contact this week. A score of 280 with no prior contact is a different priority to 290 with an open support case and regular tutor meetings.

  • Build a prioritised contact list: score below 350, no open case, no contact in the last 14 days
  • The "No contact" column shows days since any staff member logged an interaction
  • Cross-reference the Engage score with attendance percentage — if both trend down together, the signal is stronger
  • Declining scores with still-present attendance may indicate academic or personal difficulty that attendance data alone wouldn't reveal
  • A score improvement of 50+ points over two consecutive weeks is a positive recovery signal worth logging
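
The first bullet's rule can be written directly as a filter. The record fields here (`score`, `open_case`, `last_contact`) are hypothetical names for illustration, not the SEAtS export schema:

```python
from datetime import date, timedelta

def contact_priority_list(students, today):
    # Lesson rule: score below 350, no open case, and no logged contact
    # in the last 14 days. Sorted lowest score first so the most
    # concerning students top the list.
    cutoff = today - timedelta(days=14)
    return sorted(
        (s for s in students
         if s["score"] < 350
         and not s["open_case"]
         and (s["last_contact"] is None or s["last_contact"] < cutoff)),
        key=lambda s: s["score"],
    )

students = [
    {"name": "A", "score": 280, "open_case": False, "last_contact": None},
    {"name": "B", "score": 300, "open_case": True,  "last_contact": None},
    {"name": "C", "score": 340, "open_case": False, "last_contact": date(2025, 1, 10)},
]
todo = contact_priority_list(students, today=date(2025, 1, 20))
```

Student B is excluded by the open case and student C by the recent contact, leaving A as the week's priority.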

📝 Quick Check · Answer all 3 to earn your badge

Q1. Which of the following statements about identifying students who need support is correct?

Q2. Which of the following is accurate according to this lesson?

Q3. Which statement best reflects the guidance in this lesson?

4 Student-facing engagement views

Depending on your institution's configuration, students may see a simplified version of their engagement data through the student portal or app — using plain language and trend indicators rather than the raw score, which can be misinterpreted without context.

  • The default student view shows a trend indicator ("your engagement is below average") rather than the numeric score
  • Administrators can enable the numeric score for students in Admin → Engage → Student Portal Settings
  • Students see only their own data — no peer comparisons to named individuals
  • The student portal view can be disabled entirely if your welfare team prefers staff-only access
  • Students can add context notes to their profile (e.g. "family bereavement this week") visible to their tutor
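
The default trend indicator replaces the number with a plain-language band. One way such a mapping might work; the 50-point band width is a made-up value for illustration, not a documented setting:

```python
def student_trend_label(score, centre=500, band=50):
    # Plain-language indicator shown in place of the raw score.
    # The 50-point band is an assumed value, purely illustrative.
    if score >= centre + band:
        return "your engagement is above average"
    if score <= centre - band:
        return "your engagement is below average"
    return "your engagement is around average"
```
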

Tip: If you enable student-facing scores, brief your personal tutors first. Students sometimes raise their score in tutor meetings, and tutors need to be ready for an informed, supportive conversation.

📝 Quick Check · Answer all 3 to earn your badge

Q1. Which of the following statements about student-facing engagement views is correct?

Q2. Which of the following is accurate according to this lesson?

Q3. What is the recommended best practice highlighted in this lesson?

Practitioner — SEAtS Engage

For experienced users — deeper configuration and workflows.

4 lessons · Practitioner
1 Data sources: what feeds the score

The Engage score is a weighted composite of data from multiple sources. Each source you connect adds depth — but each requires clean, consistent data. A low-quality source reduces score reliability; a high-quality source significantly improves the model's predictive power.

  • Available sources: SEAtS attendance, VLE logins (Moodle, Canvas, Blackboard, Teams), library entry swipes, digital resource access, submission records, campus Wi-Fi, card door access
  • Enable a source in Engage → Config → Data Sources — it begins contributing from the next weekly run
  • Each source shows a data quality score (0–100%) — sources below 80% are contributing noise; investigate before enabling
  • Disabling a source doesn't change historical scores — it only affects future calculations
  • Attendance is typically the highest-weight source as it is the most reliable and consistent

Note: A VLE source with large numbers of missing records can actively harm score quality. Check data quality scores before enabling any new source, and review them quarterly.
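
The quality gate in the note can be pictured as a pre-check when combining sources. A sketch, assuming each source reports a quality fraction; the weights and names are illustrative, and the real model is configured in Engage → Config → Model Settings:

```python
def composite_score(values, weights, quality, min_quality=0.80):
    # Keep only sources whose data quality clears the 80% bar, then take
    # a weight-renormalised average of the survivors. Illustrative only:
    # the shipped scoring model lives inside SEAtS, not here.
    usable = [s for s in values if quality.get(s, 0.0) >= min_quality]
    total = sum(weights[s] for s in usable)
    return sum(values[s] * weights[s] for s in usable) / total

values  = {"attendance": 600, "vle": 400, "library": 500}
weights = {"attendance": 0.4, "vle": 0.3, "library": 0.2}
quality = {"attendance": 0.95, "vle": 0.90, "library": 0.60}  # library too noisy
score = composite_score(values, weights, quality)
```

With the library feed excluded, the composite is driven by the two reliable sources, which is exactly why a noisy source is better left disabled.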

📝 Quick Check · Answer all 3 to earn your badge

Q1. Which of the following statements about data sources: what feeds the score is correct?

Q2. Which of the following is accurate according to this lesson?

Q3. What is the recommended best practice highlighted in this lesson?

2 Configuring alerts and workflow triggers from Engage

Engage scores can trigger automated workflows independently of attendance alerts. A student whose score drops below a threshold can trigger a tutor notification, open a welfare case, or send a check-in message — all without manual intervention, transforming Engage from a reporting tool into an active early-warning system.

  • Configure Engage-triggered workflows in Engage → Config → Alert Rules → New Rule
  • Trigger options: score below threshold for the first time, below threshold for N consecutive weeks, or drops more than X points in a single week
  • Actions: email the student, notify their tutor, create a CRM case, or trigger a full workflow sequence
  • Combine Engage triggers with attendance for compound rules: e.g. score below 300 AND attendance below 75% triggers a priority welfare referral
  • Preview impact before saving — the system shows how many students would currently be in scope

Tip: Start with a single conservative rule: score below 300 for two consecutive weeks triggers a tutor notification. Measure volume, refine the threshold, then add more rules once you understand the signal quality.
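
The trigger types above, and the tip's starter rule, are all threshold checks over the weekly history. A sketch with assumed data shapes, not the Alert Rules engine itself:

```python
def below_for_n_weeks(history, threshold=300, weeks=2):
    # "Below threshold for N consecutive weeks" trigger; the tip's
    # conservative starter rule uses threshold=300, weeks=2.
    recent = history[-weeks:]
    return len(recent) == weeks and all(s < threshold for s in recent)

def single_week_drop(history, points=100):
    # "Drops more than X points in a single week" trigger.
    return len(history) >= 2 and history[-2] - history[-1] > points

def priority_welfare_referral(history, attendance):
    # Compound rule from the bullets: score below 300 AND attendance
    # below 75% in the same week.
    return bool(history) and history[-1] < 300 and attendance < 0.75
```
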

📝 Quick Check · Answer all 3 to earn your badge

Q1. Where in SEAtS ONE would you find this setting: "Engage-triggered workflows in Engage → Config → Alert Rules → New Rule"?

Q2. Which of the following is accurate according to this lesson?

Q3. What is the recommended best practice highlighted in this lesson?

3 Using Engage alongside the CRM

Engage data becomes most powerful when directly connected to CRM casework. A student flagged by an Engage alert who already has an open welfare case needs a different response than one appearing for the first time. SEAtS integrates both views in the student profile — the Engage score trend appears alongside case history.

  • The student profile shows the Engage score trend graph alongside the CRM case list — no switching between screens
  • When an Engage alert fires on a student with an open case, it's added to the existing case timeline rather than creating a duplicate
  • Case workers can reference the Engage score in case notes — providing objective evidence for referral paperwork
  • The "Engage + No Case" CRM view surfaces students with significant score drops who have no open support case
  • After an intervention, monitor whether the score recovers over 4–6 weeks as a measurable outcome indicator
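
The last bullet's outcome check can be logged programmatically. A sketch, with the 50-point gain borrowed from the Foundation lesson's recovery signal; the function and monitoring window are assumptions for illustration:

```python
def intervention_recovered(history, intervention_week, horizon=6, min_gain=50):
    # Compare post-intervention weekly scores against the score in the
    # week of intervention; any gain of `min_gain` points or more inside
    # the monitoring window counts as a measurable recovery.
    baseline = history[intervention_week]
    window = history[intervention_week + 1 : intervention_week + 1 + horizon]
    return any(s - baseline >= min_gain for s in window)
```
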

📝 Quick Check · Answer all 3 to earn your badge

Q1. Which of the following statements about using Engage alongside the CRM is correct?

Q2. Which of the following is accurate according to this lesson?

Q3. Which statement best reflects the guidance in this lesson?

4 Score trends, peer groups, and interpreting patterns

A single weekly score is less useful than a trend. Three consecutive weeks of decline is a meaningful signal; a single week of low score followed by recovery is often noise. Distinguishing genuine disengagement from one-off anomalies (exam week, student holiday, system data gap) is a key practitioner skill.

  • The score trend graph in the student profile shows weekly scores for the academic year — hover over each point for the exact value and date
  • A consistent downward trend over four or more weeks is a strong signal; a single anomalous week is not
  • If both engagement score and attendance decline together, the signal is stronger than if only one source drops
  • A score drop in an entire cohort simultaneously suggests a shared cause — an exam period, a cancelled event, or a data gap
  • Peer group comparison shows how a student compares to a filtered subset of peers — useful for contextualising performance within a specific programme
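
The signal-versus-noise distinction in the first two bullets reduces to counting consecutive week-on-week declines at the end of the history. A sketch with assumed function names:

```python
def consecutive_declines(history):
    # Walk backwards from the latest week, counting strictly declining steps.
    n = 0
    for i in range(len(history) - 1, 0, -1):
        if history[i] < history[i - 1]:
            n += 1
        else:
            break
    return n

def strong_decline_signal(history, min_weeks=4):
    # Four or more consecutive declining weeks is the lesson's bar for a
    # strong signal; one anomalous low week never reaches it.
    return consecutive_declines(history) >= min_weeks
```

A single bad week followed by recovery resets the count to zero, which is exactly the "often noise" case described above.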

Tip: When presenting Engage data to senior management, lead with trends rather than snapshots. "15% of Year 1 students have declining engagement over the last six weeks" is far more actionable than "our average score this week is 492."

📝 Quick Check · Answer all 3 to earn your badge

Q1. Which of the following statements about score trends, peer groups, and interpreting patterns is correct?

Q2. Which of the following is accurate according to this lesson?

Q3. What is the recommended best practice highlighted in this lesson?

Expert — SEAtS Engage

For administrators and power users — integrations, governance, and advanced setup.

4 lessons · Expert
1 Data source weighting, peer groups, and model configuration

The Engage scoring model is configurable at institutional level. Administrators can adjust source weights, define custom peer groups, and choose how the model handles missing data. These decisions have significant impact on score quality and should be made deliberately.

  • Source weights are in Engage → Config → Model Settings; default: attendance 40%, VLE 30%, library/door access 20%, other 10%
  • Adjust weights only after reviewing data quality scores — a low-quality source should carry lower weight
  • Custom peer groups can include cohort, entry qualification band, or study mode in addition to programme and year
  • Missing data handling: "exclude from calculation" reduces that week's score weight; "impute from prior weeks" carries forward the previous value
  • Document every model configuration change — configuration history is not stored in the audit log by default

Note: Model configuration changes affect all students' scores at the next weekly run. Brief welfare and personal tutor teams before making significant changes — an unexplained bulk score change generates a flood of tutor queries.
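
The two missing-data modes behave quite differently at calculation time. A sketch under the default weights listed above; the data shapes and function name are assumptions, not the shipped implementation:

```python
WEIGHTS = {"attendance": 0.40, "vle": 0.30, "access": 0.20, "other": 0.10}

def weekly_score(this_week, last_week, mode="exclude", weights=WEIGHTS):
    # `this_week` maps source -> value, with None where data is missing.
    # "impute": carry forward last week's value for missing sources.
    # "exclude": drop missing sources and renormalise remaining weights.
    values = dict(this_week)
    if mode == "impute":
        for k, v in values.items():
            if v is None:
                values[k] = last_week.get(k)
    present = {k: v for k, v in values.items() if v is not None}
    total = sum(weights[k] for k in present)
    return sum(v * weights[k] for k, v in present.items()) / total

this_week = {"attendance": 600, "vle": None, "access": 500, "other": 450}
last_week = {"attendance": 590, "vle": 520, "access": 495, "other": 450}
excluded = weekly_score(this_week, last_week, mode="exclude")  # 550.0
imputed  = weekly_score(this_week, last_week, mode="impute")   # 541.0
```

The same missing VLE week yields two different scores, which is why the missing-data setting deserves a deliberate institutional decision.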

📝 Quick Check · Answer all 3 to earn your badge

Q1. Which of the following statements about data source weighting, peer groups, and model configuration is correct?

Q2. Which of the following is accurate according to this lesson?

Q3. What is the recommended best practice highlighted in this lesson?

2 Engage AI: predictive risk and early intervention signals

Engage AI extends the weekly score model with a predictive layer identifying students statistically likely to disengage in coming weeks — before their score has fallen to a standard threshold. The model is trained on your institution's historical data and improves in accuracy over each academic year.

  • Engage AI generates a "risk trajectory" for each student: Rising, Stable, Declining, or At Risk
  • At Risk fires based on current score, rate of decline, historical patterns, and data source signals combined
  • AI predictions appear as a trajectory banner in the student profile and as a filterable column in the dashboard
  • Model accuracy metrics are in Engage → AI → Model Performance — review quarterly
  • New students in weeks 1–8 show "Insufficient data" — the model requires 8 weeks of score history minimum

Tip: Use the "At Risk" AI filter as your first-call list for proactive outreach each week — it surfaces students before they reach a formal threshold, when early contact is most effective.
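
The four trajectory categories can be illustrated with a toy heuristic. The real Engage AI model is trained on your institution's history; everything below (the 4-week comparison windows, the 25-point band, the 350 cut-off) is invented purely to show the shape of the output:

```python
def risk_trajectory(history, min_weeks=8):
    # Toy labelling only: illustrates the categories, not the trained model.
    if len(history) < min_weeks:
        return "Insufficient data"   # matches the weeks 1-8 behaviour above
    recent = sum(history[-4:]) / 4
    earlier = sum(history[-8:-4]) / 4
    delta = recent - earlier
    if history[-1] < 350 and delta < 0:
        return "At Risk"
    if delta < -25:
        return "Declining"
    if delta > 25:
        return "Rising"
    return "Stable"
```
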

📝 Quick Check · Answer all 3 to earn your badge

Q1. Which of the following statements about Engage AI's predictive risk and early intervention signals is correct?

Q2. Which of the following is accurate according to this lesson?

Q3. What is the recommended best practice highlighted in this lesson?

3 Integration architecture and data pipeline management

Engage aggregates data from multiple external systems. Each integration has its own technical characteristics, delivery schedule, and failure mode. Managing the pipeline — monitoring for failures and co-ordinating with source system owners — is an ongoing operational responsibility.

  • VLE integrations use API polling (Moodle, Canvas, Teams) or nightly SFTP (Blackboard, legacy systems)
  • Library and door access data typically arrives via flat-file SFTP from your access control or library system overnight
  • Each source has a data freshness indicator in Engage → Config → Data Sources — "Stale (>48hrs)" needs immediate investigation
  • Missing data for one week is handled gracefully; missing for three or more consecutive weeks degrades score quality significantly
  • Maintain a contact list of the technical owner for each source system — when a feed fails overnight, you need to know who to call
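
A freshness check like the dashboard indicator is easy to replicate in your own monitoring scripts. The label string mirrors the UI; the function itself is an assumed sketch, not a SEAtS API:

```python
from datetime import datetime, timedelta

def freshness_status(last_received, now, stale_after=timedelta(hours=48)):
    # Anything older than 48 hours gets the dashboard's stale label and
    # should trigger a call to that source's technical owner.
    return "Stale (>48hrs)" if now - last_received > stale_after else "Fresh"

now = datetime(2025, 1, 20, 9, 0)
overnight = freshness_status(datetime(2025, 1, 19, 23, 0), now)  # fresh feed
broken    = freshness_status(datetime(2025, 1, 17, 8, 0), now)   # 73 hours old
```
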

📝 Quick Check · Answer all 3 to earn your badge

Q1. Which of the following statements about integration architecture and data pipeline management is correct?

Q2. Which of the following is accurate according to this lesson?

Q3. Which statement best reflects the guidance in this lesson?

4 Reporting, governance, and student data ethics

Engage data raises important questions about student privacy, consent, and the ethical use of behavioural data. Institutions deploying Engage have a responsibility to be transparent with students about what is collected, how it is used, and what rights they have.

  • Inform students about Engage data collection via your institution's Privacy Notice — confirm with your DPO that the notice covers all active sources
  • Students have the right to access their own engagement data under UK GDPR — the platform supports this via the student portal and staff-assisted SAR exports
  • Configure data retention periods in Admin → Data → Retention Policy in line with your institutional student data retention schedule
  • Engagement data should never be used in academic decision-making without explicit institutional policy approval and student disclosure
  • Governance reports showing aggregate trends and intervention outcomes are generated from Engage → Reports → Governance Summary

Note: Before adding new data sources (gym access, campus Wi-Fi dwell time), consult your DPO. Additional sources may require a Data Protection Impact Assessment and an update to your student Privacy Notice.

📝 Quick Check · Answer all 3 to earn your badge

Q1. Which of the following statements about reporting, governance, and student data ethics is correct?

Q2. Which of the following is accurate according to this lesson?

Q3. What is the recommended best practice highlighted in this lesson?

🏅

Badge Earned!

You completed all lessons in this level.