
Progress - Article

Progress Analytics shows role-based, unified progress across activities and modules—completion, time spent, certificates, and re-certification—via dashboards and exports for learners, instructors, and administrators.
Updated: 14 Mar 2026
27 min read

Summary

Progress Analytics provides a unified, role-based view of learner progress across activities, modules, events, and certifications. It enables participants, instructors, and administrators to monitor completion, time spent, certification status, and re-certification readiness. 

In this article you will learn:

  • How Progress Analytics tracks completion and progress across learning activities
  • How administrators and instructors monitor learner status and outcomes
  • How progress data includes time spent, certification status, and re-certification readiness
  • How Progress Analytics supports operational oversight and compliance monitoring

Purpose and Scope

Progress is designed to answer one of the most fundamental questions in learning operations:

“Where do learners stand—right now—and what does that mean?”

This view combines progress status, time spent, completion logic, certification lifecycle data, and organizational context into a single navigable experience. Unlike isolated course views, it reflects the authoritative state of learning progress across native content, instructor-led activities, SCORM packages, xAPI content, events, and administrative actions.

It is particularly valuable for:

  • Participants monitoring their own learning journey
  • Instructors supervising activities they are responsible for
  • Course administrators managing delivery, attendance, and exceptions
  • Compliance owners and managers overseeing readiness and re-certification

Progress is attempt-aware, ensuring that progress, completion, and certificates always reflect the learner’s current enrollment or re-certification cycle, not historical attempts.

Role-Based Perspectives and Access

Progress visibility and available actions depend on role and responsibility:

  • Participants: See their own progress, time spent, completion status, certificates, and upcoming re-certifications
  • Instructors: See progress and attendance for activities they are responsible for, including participant-level drilldowns and attendance marking where enabled
  • Course Administrators: Access operational controls such as attendance marking, force completion, administrative comments, and participant oversight
  • Platform / Global Administrators: Have cross-organization visibility and full audit access, subject to organization-layer configuration

This role-based design ensures transparency without overexposure, and aligns with governance and data-access requirements.

How Progress Is Calculated

Progress & Statistics aggregates learning signals from multiple content and delivery types into a consistent progress model:

  • Native content (pages, videos, audio, H5P): Progress is based on learner interaction and completion logic defined in the activity
  • Instructor-led events: Progress and completion are driven by attendance marking
  • SCORM content: Completion, score, and time are reported via the SCORM runtime and reflected at activity and module level
  • xAPI content: Statements are recorded in the Learning Record Store (LRS) and surfaced at page, module, and activity level
  • Time spent: Accumulated across interactions but does not alone imply completion

This ensures that progress reflects actual learner behavior, regardless of content format or technical implementation.
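As an illustration of how signals from different module types can roll up into a single figure, the sketch below computes an activity-level percentage as a simple mean over required modules. The weighting is an assumption for illustration only; the platform's actual aggregation rules depend on configuration.

```python
def activity_progress(modules):
    """Roll module-level progress (0-100) up to an activity-level percentage.

    Illustrative sketch only: assumes a simple mean over required modules,
    which is not necessarily the platform's actual weighting.
    """
    required = [m["progress"] for m in modules if m["required"]]
    return round(sum(required) / len(required)) if required else 0

# A path with a finished course, a half-done SCORM module, and an optional extra:
activity_progress([
    {"progress": 100, "required": True},   # native course, done
    {"progress": 50, "required": True},    # SCORM module, halfway
    {"progress": 0, "required": False},    # optional module, ignored
])  # 75
```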

Progress Views and Drilldowns (Activity-Centric Navigation)

Progress & Statistics is always anchored to a specific training activity. The view does not start from a global learner or program overview—instead, it answers the question:

“What is the progress status for this activity, and how does it look from my role’s perspective?”

From that shared entry point, the same activity data is presented differently depending on who is viewing it:

  • Learners see their own progress for the activity, including completion status, time spent, module-level breakdown, and certificates earned
  • Instructors see aggregated progress across participants they are responsible for, allowing them to assess learning momentum, identify learners who are falling behind, and manage delivery-related actions such as attendance marking
  • Administrators and managers see supervisory metrics across all authorized participants for the activity, including distributions of progress, deadlines, certifications, and time investment

Within an activity, navigation follows a consistent drilldown pattern:

  • Activity overview shows aggregated KPIs such as progress distribution, time spent, deadlines, and certification status (or individual status for learners)
  • Module-level views break the activity into its components—courses, events, assignments, existing training, H5P, SCORM, or other modules—showing how progress and time are accumulated
  • Participant-level views (where permitted) allow supervisors to drill into an individual’s progress within the activity, without losing the activity context

Module-level example showing a learning path with two standard self-paced modules, viewed from a supervisory role.

Throughout this navigation, filters can be applied—such as date range, status, or module type—to refine what is shown. These filters adjust the metrics within the same activity rather than changing the scope to unrelated content.

This design ensures that all users—learners, instructors, and administrators—are always looking at the same activity, but through a role-appropriate lens that supports self-tracking, supervision, or operational oversight without fragmenting the experience.

Main Statistics Page Rubrics by Module Type and Role

The following is a quick reference of what the main Progress page shows (“top rubrics”) by module type, split between the Participant view and the supervisory Administrator/Instructor view.

  • Single Course module (native course)
    • Participant view: Personal progress through the course content. Rubrics summarize your engagement across the course objects, typically Pages, Videos, Audios, and Tests (shown when used), plus your overall status/progress for the activity
    • Admin/Instructor view: Aggregated course engagement and progress across participants. Rubrics focus on how the cohort interacts with the course objects (Pages, Videos, Audios, Tests when used), plus cohort-level progress context (e.g., participation/progress distribution and totals, depending on configuration)
  • Single Event module (ILT/webinar)
    • Participant view: Your completion context for the scheduled session: your activity status (Not started / In progress / Completed), and attendance-related status when applicable (e.g., marked attended)
    • Admin/Instructor view: Operational delivery overview across participants: cohort status distribution, and (where configured) attendance and webinar recording visibility to support delivery follow-up and compliance checks
  • Single Assignment module (Submissions view)
    • Participant view: Your submission lifecycle: whether you have submitted, and the outcome/feedback signals that apply to the assignment setup
    • Admin/Instructor view: Assessment throughput overview: rubric-style visibility into the submissions state across learners and the administrative “work queue” nature of marking/review (with drilldown into submissions)
  • Existing Training module (ET)
    • Participant view: Your progress in the referenced/linked training. The main activity behaves like a learning-path style overview if the activity contains multiple modules
    • Admin/Instructor view: Two-layer visibility: (1) the main activity shows the learning-path style overview when relevant, and (2) when opening the ET module statistics, admins see the statistics of the referenced activity, including signups that exist in that referenced activity (not only the “main” activity). The exact rubrics mirror the referenced module type (course/event/assignment/etc.)
  • Learning Path style (2+ modules)
    • Participant view: A single consolidated snapshot of your journey across modules: Not started / In progress / Completed plus overall Progress (designed to summarize a multi-module path rather than object-level detail)
    • Admin/Instructor view: The same consolidated snapshot, interpreted as a cohort-level dashboard: Participants plus Not started / In progress / Completed and overall Progress—used to quickly identify friction points and where learners are stalling across the path
  • SCORM course module (external package)
    • Participant view: The main page emphasizes SCORM completion signals rather than native object rubrics: high-level completion/progress for you, and SCORM outcome indicators (e.g., completion/pass state and score when reported by the package)
    • Admin/Instructor view: A cohort-level summary of the SCORM package outcomes (completion/pass/score/time where reported). Because SCORM data depends on what the package reports, the visible “rubrics” skew toward attempt/outcome indicators rather than pages/videos/tests

Intent of This Behavior

The same Progress and Statistics page is designed to answer different questions depending on who is viewing it and how the activity is structured.

  • Participants use the page to understand their own status: “How am I progressing in this activity, and what do I need to do next?”
  • Administrators and Instructors use it for supervision and insight: “How are participants progressing overall, where are they getting stuck, and which parts of the activity require attention?”

What appears at the top level is determined by the module types included in the activity, including whether adaptive logic is used. Each module type exposes the rubrics that best reflect learning intent and progress:

  • Course modules surface object-level engagement (Pages, Videos, Audios, Tests) alongside overall progress and completion
  • Event modules emphasize attendance, webinar recording engagement (if enabled), and completion status
  • Assignment modules focus on submissions, assessment status, and review actions
  • Learning Path–type activities present a consolidated progress view across multiple modules and stages
  • Existing Training modules mirror the progress and statistics of the referenced activity or module
  • Adaptive modules dynamically reflect only the content paths that were actually triggered for each participant. This means:
    • Optional or conditional modules appear in progress views only when activated
    • Progress and completion reflect relevance, not a fixed curriculum
    • Supervisors see aggregate outcomes shaped by adaptive decisions, not artificial “missing” steps

As a result, the page always reflects the real learning journey, not just the theoretical structure. Progress, completion, and time spent are shown in the context of what learners were expected to do, based on role, configuration, and adaptive design.

This ensures the same interface remains meaningful for personal orientation, instructional follow-up, and administrative oversight—while accurately representing adaptive learning behavior.

Participants Section — Columns and Actions by Module Type

The Participants section is always scoped to a single activity. Columns and available actions adapt dynamically based on the module type(s) used in that activity and the viewer’s role.

Drill-down view of a single self-paced course module with certification enabled and multiple participants.
The following shows, per module type, the columns displayed and the available actions and special indicators:
Single Course Module
  • Name – User avatar, full name, and email address
  • Pages – Progress on course pages (if pages are used)
  • Videos – Video completion indicators (if videos are included)
  • Audios – Audio completion indicators (if audios are included)
  • Tests – Test completion indicators (if tests are included)
  • Status – Not started / In progress / Completed
  • Certificate – Download certificate button (if configured)
  • Actions (⋮) – Mark attendance / Revoke attendance
Single SCORM Course Module
  • Name – User avatar, full name, and email address
  • Completion – SCORM completion status (Unknown / Complete)
  • SCORM status – SCORM success status (Unknown / Passed)
  • Score – Final score percentage (if provided by SCORM package)
  • Total time – Total time spent in the SCORM module
  • Status – Not started / In progress / Completed
  • Certificate – Download certificate button (if configured)
  • Actions (⋮) – Mark attendance / Revoke attendance
Single Event Module
  • Name – User avatar, full name, and email address
  • Status – Not started / In progress / Completed
  • Attendance – Not marked / Attended
  • Webinar recording – Viewed / Not viewed (if recording is configured)
  • Certificate – Download certificate button (if configured)
  • Actions (⋮) – Mark attendance / Revoke attendance
Assignment Module 
(Section labeled Submissions)
  • Name – User avatar, full name, and email address
  • Submitted – Submission status
  • Assessment – Assessment result or score (if applicable)
  • View submission / Lock / Unlock
Existing Training Module
  • Name – User avatar, full name, and email address
  • Status – Not started / In progress / Completed
  • Certificate – Download certificate button (if configured)
  • Actions (⋮) – Mark attendance / Revoke attendance
Learning Path
(Multiple modules)
  • Name – User avatar, full name, and email address
  • Status – Not started / In progress / Completed
  • Certificate – Download certificate button (if configured)
  • Actions (⋮) – Mark attendance / Revoke attendance
Event Module — Webinar Recording
(When added)
  • Webinar recording status – Viewed / Not viewed

This status is visible both:

  • In the Participants section of the activity
  • On the individual Participant’s Progress page
 

Notes and Interpretation

  • Columns appear only if the corresponding module exists
  • Assignment modules intentionally replace Participants with Submissions
  • Webinar recording status is tracked separately from attendance
  • Certificate and attendance actions are consistently available where configured

Individual Progress View - Details

When individual progress is viewed by supervisory roles, each participant can be reviewed in detail—providing context behind aggregated results and enabling deeper assessment of learning behavior and outcomes. The available insights vary by module type, as different content elements generate different metrics.

For natively authored course modules, this view offers especially rich insight, including detailed engagement data such as page and video views, assessment attempts and answers, and document downloads within the course.

Example showing a natively authored course module in detail view, with options to analyze course-level engagement for an individual participant (re-certification by attempt).

How Progress and Completion Are Calculated

Progress and completion are rule-driven, not cosmetic. An activity is only marked Completed when its configured completion conditions are met. These conditions depend on three key factors:

  • Module composition (course, event, assignment, learning path, etc.)
  • Scheduling (whether an end date exists)
  • Certification and/or attendance requirements

This ensures that completion reflects real learning and compliance intent, not just content access. At a high level, the core completion principles are:

  • Progress reflects how much of the required learning journey has been completed
  • Completion reflects whether all mandatory conditions have been satisfied
  • Certificates, schedules, and attendance rules always take precedence over simple content consumption

Completion Logic by Activity Configuration

Activities with Multiple Modules (Learning Paths) at the Activity Level.

Configuration and when the activity is marked Completed:

  • No schedule, no certificate: When all required modules/content are 100% completed
  • Schedule, no certificate: Only after the end date has passed and required modules are completed
  • No schedule, certificate: When the certificate is issued
  • Schedule and certificate: Only when both conditions are met: certificate is issued and end date has passed

Single-Module Activity Completion Logic

This is a single course module (standalone or within a Learning Path).

Configuration and completion trigger:

  • Certificate enabled: Certificate issued
  • No certificate, scheduled: End date has passed
  • No certificate, no schedule: 100% of content consumed
  • Certificate and schedule: Certificate issued and end date has passed

Event Module (standalone or within a Learning Path)

Configuration and completion trigger:

  • Certificate enabled: Certificate issued
  • Attendance marking enabled: Attendance marked
  • No certificate, no attendance: End date has passed
  • Certificate and schedule: Certificate issued and end date has passed

If an event:

  • Has no certificate
  • Has attendance marking disabled
  • Has an end date

Then the activity is marked Completed after the end date passes, provided any other modules in the activity are completed. This follows the same logic as other scheduled, non-certified activities.
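The completion rules in the tables above can be summarized as a small decision function. The flags and names below are hypothetical, introduced only to make the rules explicit; they are not the platform's API.

```python
from dataclasses import dataclass

@dataclass
class ActivityConfig:
    has_schedule: bool      # an end date exists
    has_certificate: bool   # a certificate is configured

def is_completed(cfg, content_done, certificate_issued, end_date_passed):
    """Apply the activity-level completion rules described above (illustrative)."""
    if cfg.has_certificate and cfg.has_schedule:
        # Both conditions must hold: certificate issued AND end date passed
        return certificate_issued and end_date_passed
    if cfg.has_certificate:
        return certificate_issued
    if cfg.has_schedule:
        # Scheduled, non-certified: end date passed and required content done
        return end_date_passed and content_done
    return content_done  # no schedule, no certificate: 100% content completion

# Scheduled, certified activity: finishing content alone is not enough.
is_completed(ActivityConfig(True, True), True, False, False)  # False
```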

Administrative Overrides: Force Completion and Governance

In certain operational scenarios, instructors and administrators may need to intervene manually to resolve progress or completion states that cannot be handled automatically.

Force completion allows authorized users to mark an activity—or a specific module within an activity—as completed when normal tracking is insufficient or inappropriate. This provides necessary operational flexibility while maintaining governance, traceability, and compliance.

Where Force Completion Can Be Performed

Force completion can be initiated from several places in the platform:

  • Progress (Statistics) page
  • Activity details page → Participants (Activity signup list)
  • Participants (Activity signup list) as a single or bulk operation

The action can be applied to:

  • The entire activity, or
  • A specific module within the activity

This allows administrators to resolve issues at the correct level of granularity.

Force Completion — Governance and Behavior Overview

When Force Completion Is Typically Used

Force completion is intended for exceptional cases, such as:

  • Attendance has been verified outside the platform (e.g. physical sign-in sheets, external systems)
  • Technical issues prevented normal progress or attendance tracking
  • Exceptional business, contractual, or compliance-related decisions require administrative resolution

 

How the Force Completion Process Works

When selecting Mark as completed, the administrator is presented with a modal dialog where they can:

  • Add an optional completion comment explaining the reason
  • Choose whether to notify the participant by email (This option is available only when no certificate is attached to the activity)

Once confirmed:

  • The training status is updated immediately
  • If a certificate is attached to the activity or module, it is generated automatically
  • Required training feedback is bypassed—the certificate is still issued
  • If a certificate was later replaced with a different one, the new certificate is shown to the user

 

Audit Trail and Reporting

Every force completion action creates a permanent audit trail:

  • The user who performed the action
  • The timestamp
  • The optional completion comment

This information appears in:

  • Progress report (columns Marked completed by and Completion comment)

Comments are not editable after submission, ensuring integrity of audit data.

Option Availability and Governance Rules

The availability of Mark as completed and Revoke completion depends on participant status:

Allowed for:

  • Registered (active)
  • Cancelled
  • Expired enrollments

Not allowed for:

  • Reserved
  • Waiting list
  • Blocked
  • Pending users

For unsupported statuses:

  • The actions are hidden
  • In bulk operations, those signups are ignored
  • An error is shown if only unsupported signups are selected

Additional rules:

  • Mark as completed is only available if the participant has not already completed the activity
  • Once applied, Revoke completion becomes available
  • Revoke completion is not available for participants who completed the activity themselves
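The status-based governance rules above can be expressed as a small eligibility check. The lowercase status labels and function names below are illustrative only, not platform identifiers.

```python
# Signup statuses where Mark as completed / Revoke completion are offered
ALLOWED = {"registered", "cancelled", "expired"}
# Statuses where the actions are hidden
BLOCKED = {"reserved", "waiting_list", "blocked", "pending"}

def can_mark_completed(status, already_completed):
    """Governance check for Mark as completed (illustrative labels)."""
    return status in ALLOWED and not already_completed

def bulk_targets(signups):
    """In bulk operations, unsupported signups are ignored; an error is raised
    only when no eligible signups remain in the selection."""
    eligible = [s for s in signups if can_mark_completed(s["status"], s["completed"])]
    if not eligible:
        raise ValueError("Only unsupported signups were selected")
    return eligible
```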

 

Notifications and Emails

When force completion is applied, participants receive an automated notification:

  • If a certificate is attached → “Certificate was achieved” email
  • If no certificate is attached → “User completed training” email

These templates are managed under Settings → Translations → Emails.

Progress, Percentage, and Data Integrity

Force completion affects status visibility—but preserves learning data:

  • Progress percentage is set to 100%
  • Activity status becomes Completed
  • Underlying content states (pages, videos, H5P, SCORM, events, tests) remain unchanged in analytics
  • On learner-facing views (My Dashboard, Learn mode, LP overview), the user sees Completed / 100%

If force completion is applied to modules used as access restrictions, it immediately unlocks restricted content.

Re-certification and Attempts

Force completion applies only to the current attempt:

  • In re-certification scenarios, a new attempt is created when required
  • The forced completion remains on the previous attempt
  • The new attempt starts with no progress, ensuring re-certification integrity

 

How Time Spent Is Tracked

Time spent represents the actual, active time a learner spends engaging with a course while it is in focus. The platform is deliberately strict in how time is counted to ensure accuracy and avoid inflating engagement metrics.

Core Tracking Logic

Time is counted only while the course page is actively in focus and the learner is interacting with it.

This means:

  • Time is tracked when the browser tab containing the course is active
  • Time pauses immediately when the learner switches to another browser tab, application, or window
  • Time resumes once the learner returns to the course tab

If the browser is open but the learner interacts with other desktop elements (e.g. clicking outside the browser), time tracking is suspended until the learner clicks back into the course page.

Video, Audio, and Assessments
  • Time spent watching videos and completing tests is counted, including when videos are played in full-screen mode
  • Background playback does not count:
    • If the browser is minimized
    • If another tab is active
    • If the device switches to background or picture-in-picture mode
  • Audio or video files opened via a file widget are not counted toward time spent

On mobile devices:

  • Time is not counted in background mode, picture-in-picture, or task-switcher views
System States and Connectivity

Time tracking responds intelligently to system and network changes:

  • If the computer enters sleep mode, locks, or the screen turns off, time tracking is paused
  • After unlocking or waking the device, the learner should click or scroll on the course page to resume tracking
  • Switching Wi-Fi networks or from wireless to Ethernet does not interrupt time tracking
  • Connecting to or disconnecting from a VPN does not affect time tracking
  • If the internet connection drops:
    • Time continues to be tracked locally
    • Time is synced when the connection is restored
    • If the learner closes the course tab while offline, unsynced time is not added


Multiple Devices and Tabs

Time spent is not cumulative across parallel sessions:

  • Time is not combined across:
    • Multiple browser tabs
    • Different browsers
    • Different devices (e.g. PC and smartphone)
  • Only the actively focused session contributes to time spent
External Navigation
  • Following an external link pauses time tracking
  • Time resumes only when the learner returns to the course tab
Known Limitations
  • xAPI courses currently do not support time spent tracking
  • Event attendance alone does not generate time spent unless recordings are actively viewed
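The focus-based counting rule above can be modeled as summing only the intervals during which the course tab was in focus. The event-list representation is a simplification for illustration, not how the platform records the data.

```python
def active_time(events):
    """events: chronologically ordered (timestamp_seconds, "focus" | "blur") pairs.
    Only intervals while the course tab is in focus count toward time spent;
    switching tabs, locking the screen, or leaving the browser emits a blur."""
    total, focused_since = 0, None
    for ts, kind in events:
        if kind == "focus" and focused_since is None:
            focused_since = ts
        elif kind == "blur" and focused_since is not None:
            total += ts - focused_since
            focused_since = None
    return total

# Focus at t=0, switch tabs at t=120, return at t=300, leave at t=360:
active_time([(0, "focus"), (120, "blur"), (300, "focus"), (360, "blur")])  # 180
```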

H5P in Progress, Analytics, and Reporting — Overview

H5P objects in Eurekos are treated as interactive learning and assessment components that can drive progress, restrict access, generate attempts, and produce detailed analytics and reports. How an H5P object behaves depends on its type, configuration, and placement within a course or activity.

  • H5P objects contribute to page and activity progress once their completion trigger is met (for example clicking Finish, Check, or completing required interactions)
  • Completion triggers vary by H5P type (e.g. viewing all pages, answering questions, clicking a button)
  • Progress events are also sent to any connected Learning Record Store (LRS), ensuring consistency across statistics and reporting.
  • Different H5P configurations affect tracking behavior, completion logic, and attempts

H5P Progress, Attempts, and Reporting — Quick Reference

H5P Objects Shown as “Tests”: which H5P objects appear under Tests on the Statistics page

Includes assessment and interactive types such as: Multiple Choice, Single Choice Set, True/False, Drag and Drop, Drag the Words, Fill in the Blanks, Mark the Words, Question Set, Summary, Arithmetic Quiz, Memory Game, Flashcards, Interactive Video, Course Presentation, and SCORM / xAPI. 

Only specific technical H5P types contribute to test-level analytics and attempts.

H5P Attempts in Analytics: how multiple attempts are tracked and reviewed
  • Attemptable H5P objects record multiple attempts
  • Administrators, instructors, and authorized roles can review attempts via See details on the Analytics page
  • Each attempt is stored separately and contributes to analytics and reports

Supported attempt-based H5P types include: Question Set, Multiple Choice, Drag & Drop, Fill in the Blanks, Mark the Words, Single Choice Set, True/False.

Save State for H5P: resume unfinished H5P interactions

When enabled (via Eurekos Service Desk), Save Content State automatically remembers a learner’s progress inside H5P objects:

  • State is saved only when users access content through proper learning flows (Learn mode, My Trainings, Analytics → Go to course)
  • State is not saved when opening H5P directly via URL or list views
  • Learners and instructors can resume unfinished interactions
  • A Clear H5P state action allows users to reset progress on a specific course page

Resetting state does not remove previous attempts—new attempts are recorded separately.

H5P Answers Reports: detailed reporting on user responses

For supported H5P types, administrators and instructors can generate Answers Reports, showing:

  • Question text
  • Learner responses
  • Correct / incorrect status (where applicable)
  • Scores and timestamps

Reports are available for types such as Question Set, Multiple Choice, Single Choice Set, Fill in the Blanks, Mark the Words, True/False, Essay, and Drag Text.

Course Presentation & Interactive Video: special completion behavior
  • If no interactive elements are added, these objects behave like viewed content
  • If interactions are added, they behave like tests
  • Interactive Video must be watched until the end to complete—this is not configurable
  • Each interaction inside the object is tracked separately
“Give User Another Try”: administrative remediation option

When an H5P object has an attempt limit enabled and the learner fails all attempts:

  • Authorized roles can grant an additional attempt via Give user another try
  • This preserves previous attempts while allowing a new one

Supported roles include Administrators, Course Administrators, Instructors, Support, Assistants, and Partners.

SCORM Progress Tracking and Administration

SCORM remains a widely used industry standard for packaged learning content, particularly for legacy courses, externally authored content, and compliance-driven training. In Eurekos, SCORM packages, xAPI packages, and H5P-based SCORM objects (embedded on a course page or presented full-page) are treated similarly to SCORM course modules for progress tracking, statistics, and reporting—while still respecting their technical differences.

This section explains how SCORM progress is tracked, how it appears in Analytics, and which administrative controls are available to ensure reliable outcomes, governance, and auditability.

How SCORM Progress Is Tracked

SCORM progress tracking depends on signals sent by the package itself and on platform-level configurations.

  • SCORM objects track progress after completion or interaction, depending on how the package is authored
  • Completion may be triggered by:
    • Navigating through all pages
    • Passing an embedded assessment
    • Reaching an internal “completed” or “passed” state
  • Progress data is tracked:
    • On the Eurekos platform (for Analytics and Progress views)
    • In a connected LRS (where applicable)

Unlike native course modules, SCORM progress is not tracked page-by-page. Instead, progress appears only once the package signals completion or pass/fail.

SCORM Objects in Analytics

  • SCORM Course: Shown as a single module with status, score, and time. Uses SCORM and platform statuses
  • H5P SCORM: Shown under Tests. Requires completion or a test pass
  • H5P SCORM Bubble: Shown under Tests (when used this way). Full-page SCORM experience
  • xAPI Course / H5P xAPI: Progress returned from the LRS. Limited platform-side tracking

Definitions

  • A SCORM course can be imported as a standalone course module, containing all pages, quizzes, videos, and assessments within the SCORM package itself
  • A SCORM package can also be imported into the media library as an H5P SCORM object, which provides two embedding options:
    • The H5P SCORM object can be embedded within a course page using the native authoring tools
    • The H5P SCORM object can be embedded as a full-page experience using the native authoring tools. This is visualized as a SCORM Bubble. Completion logic is tracked identically to standard SCORM objects

SCORM Status Model

SCORM-based activities expose three parallel status dimensions:

  • Completion Status (Eurekos): values Unknown, Completed; reflects content completion
  • SCORM Status: values Unknown, Passed; reflects assessment success
  • Platform Status: values Not started, In progress, Completed; reflects learner progress in Eurekos

Certificates for SCORM-based activities depend on receiving the Passed state from the package. Some packages never send Passed, so Eurekos can be configured to force the Passed state when a package does not report it. This ensures accurate analytics and downstream actions.
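A sketch of how raw SCORM runtime values might map onto these status dimensions, including the optional force-Passed behavior. The function and parameter names are illustrative; only the SCORM data model element names (cmi.completion_status, cmi.success_status) come from the standard.

```python
def scorm_statuses(cmi_completion, cmi_success, force_passed=False):
    """Map raw SCORM runtime values (cmi.completion_status / cmi.success_status)
    onto the Completion Status and SCORM Status dimensions (illustrative only).

    force_passed models the optional platform configuration for packages that
    complete but never report a success status.
    """
    completion = "Completed" if cmi_completion == "completed" else "Unknown"
    passed = cmi_success == "passed" or (force_passed and completion == "Completed")
    return completion, ("Passed" if passed else "Unknown")

# A package that completes but never sends a success status:
scorm_statuses("completed", "unknown")                    # ("Completed", "Unknown")
scorm_statuses("completed", "unknown", force_passed=True) # ("Completed", "Passed")
```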

Key Standard SCORM Behaviors

Update score

The latest score always overwrites previous scores, even if it is lower.
Resume Behavior (Suspend Data)

SCORM packages often support “resume where you left off” functionality. This is supported by default.

  • When re-uploading a SCORM package, Eurekos clears suspend data automatically
  • When a sign-up expires, suspend data is cleared

These standard configurations prevent mismatched resume states.

Reset Progress (Administrative Control)

Reset SCORM progress for individual users:

  • Clears: Completion status, Score, Time spent, Attempts
  • Recalculates: Module status, Activity-level progress
  • Preserves: Previously issued certificates (they remain valid). If a certificate has been issued, Training Status and Completion time in the report are not updated after the reset

Authorized roles: Administrators, Course Administrators, Instructors responsible for the activity, and the Support role.

SCORM and Certificates
  • Certificates are issued only after SCORM reaches Passed
  • After resetting progress: Certificates remain valid and Status does not revert to Not started

This ensures certificate integrity, even when progress data is administratively corrected. This is the standard configuration.

Common SCORM Pitfalls — and How to Avoid Them

SCORM content is powerful but highly dependent on how packages are authored and configured. The following pitfalls are among the most common causes of inconsistent progress, failed certifications, and administrative confusion—and can usually be prevented with the right platform settings and governance practices. 

Reach out to Eurekos Service Desk if your use case requires a different approach or you have specific preferences. Common pitfalls include:

  • A SCORM course completes but never passes—this may require the forced Passed state configuration (if not already enabled)
  • After a SCORM package is updated, learners resume on an incorrect slide—clearing suspend data after re-upload must be enabled (the default configuration)
  • Scores appear lower than expected—configure the Update score option not to overwrite with the latest score, so the highest score obtained is recorded
  • Progress seems to “disappear” after an enrolled user’s enrollment expires—review expired-signup settings based on whether learners are expected to resume content after reactivation

How to Validate SCORM and xAPI Tracking

When troubleshooting SCORM or xAPI progress, always validate the content outside the LMS first.

For SCORM packages, upload the file to SCORM Cloud to confirm that the package itself reports completion and/or passed statuses correctly. This helps identify corrupted packages or incorrect completion logic before adjusting platform settings.

For xAPI content, verify that the package is truly xAPI-based and then check tracking directly in the Learning Record Store (LRS). Since xAPI progress is not tracked natively in the platform, the LRS is the source of truth for completion and score data.

Testing content in this way isolates package behavior from platform configuration—saving time and preventing misdiagnosis.
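When checking xAPI tracking in the LRS, a statements query typically takes the shape sketched below. The parameter and header names follow the xAPI specification; the endpoint URL, learner mbox, and activity IRI are placeholders, and the HTTP call itself is not executed here:

```python
import json

def build_statements_query(lrs_endpoint: str, learner_mbox: str,
                           activity_id: str) -> tuple[str, dict, dict]:
    """Build an xAPI GET /statements request for one learner and activity.

    lrs_endpoint, learner_mbox, and activity_id are placeholders;
    pass the values for your own LRS and content.
    """
    url = lrs_endpoint.rstrip("/") + "/statements"
    params = {
        # The agent filter must be a JSON-encoded Agent object (xAPI spec)
        "agent": json.dumps({"mbox": learner_mbox}),
        "activity": activity_id,
        "limit": 50,
    }
    headers = {
        # Required on every xAPI request
        "X-Experience-API-Version": "1.0.3",
    }
    return url, params, headers
```

The returned URL, query parameters, and headers can then be passed to any HTTP client (with your LRS credentials) to inspect the completion and score statements the content actually sent.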

Permissions, Data Access, and Organization Layer

Progress Analytics and the Progress Report are governed by role-based permissions and the organization layer. Users can only see data they are authorized to access based on their role, organizational affiliation, and scope of responsibility.

In practice:

  • Data visibility is limited to permitted organizations, activities, and entities
  • Parent organizations can see aggregated sub-organization data; sub-organizations cannot see upward or sideways
  • Blocked users remain visible for historical accuracy; deleted users are excluded for privacy compliance; Cancelled and expired enrollments remain visible for audit and traceability
  • The same rules apply consistently to both on-screen analytics and exported reports

This ensures secure, consistent, and audit-ready access to data across the platform.
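The downward-only visibility rule (parents see sub-organizations; sub-organizations never see upward or sideways) amounts to an ancestry check over the organization hierarchy. The data structure and function below are an illustrative sketch, not the platform's implementation:

```python
def can_view(viewer_org: str, target_org: str,
             parent_of: dict) -> bool:
    """True if target_org is viewer_org itself or one of its descendants.

    parent_of maps each organization to its parent (None at the root);
    this structure is an assumption for illustration only.
    """
    node = target_org
    while node is not None:
        if node == viewer_org:
            return True  # viewer is the org itself or an ancestor of it
        node = parent_of.get(node)
    return False  # upward or sideways visibility is denied
```

Walking from the target organization toward the root either reaches the viewer's organization (visible) or exhausts the hierarchy (not visible), which captures both the upward and the sideways restriction in one check.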

Progress Report

The Progress Report provides a comprehensive, exportable view of learner progress and completion across activities and modules, designed for operational oversight, compliance documentation, and downstream analysis. While the Progress Analytics views focus on real-time visibility and drill-down navigation, the Progress Report is optimized for structured evidence, audits, and cross-system use.

It consolidates enrollment, progress, completion, certification, and organizational context into a single dataset—making it a core reporting tool for administrators, managers, and auditors.

The Progress Report is designed to answer questions such as:

  • Who is enrolled, in progress, completed, or overdue?
  • When did learners enroll and complete training?
  • Which activities, modules, and certifications are involved?
  • Can we document progress and completion for audits or compliance?
  • How does progress differ across organizations, instructors, or programs?

Report Configuration Overview

When generating a Progress Report, administrators configure how progress data is selected and interpreted. The report is context-aware and supports multiple filtering strategies depending on the operational question being answered.

Users can generate the report for:

  • All activities
  • One or multiple specific activities

Scheduled activities display their dates directly in the selector. A configuration option controls whether cancelled activities are available for selection.

Report Filtering Logic

The report supports three mutually exclusive filtering perspectives. Each determines how time relevance is applied:

| Filter Mode | What It Represents | Typical Use Case |
| --- | --- | --- |
| Enrollment | Users enrolled during the selected period | Intake analysis, onboarding volume, admissions audits |
| Completion | Users who received Completed status during the period | Certification evidence and compliance reporting; can distinguish between all completions, certified completions, and completions without certification |
| Activity Schedule | Activities whose scheduled dates fall within the period | Instructor-led training, events, calendar-based reporting |

For scheduled activities, the report includes activities whose start and end dates intersect the selected window. Self-paced activities can be included in the report using the Activity schedule filter. When Include activities without schedule is enabled, the report includes:

  • Activities that start and end within the selected date range
  • Activities without any schedule
  • Activities with only a start date or only an end date
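The inclusion rules above amount to an interval-intersection check plus a flag for unscheduled activities. A minimal sketch, where the function and flag names are illustrative rather than platform settings:

```python
from datetime import date

def include_in_report(start, end, win_start: date, win_end: date,
                      include_unscheduled: bool = False) -> bool:
    """Sketch of Activity-schedule filtering; names are illustrative.

    Fully scheduled activities are included when their dates intersect
    the selected window; unscheduled or partially scheduled activities
    (start or end may be None) are included only when the flag is on.
    """
    if start is not None and end is not None:
        # Two intervals overlap when each starts before the other ends
        return start <= win_end and end >= win_start
    # No schedule at all, or only a start date / only an end date
    return include_unscheduled
```

For example, with a January 2026 window, an activity running 10–20 January is included, a December 2025 activity is not, and an activity with no schedule is included only when the flag is enabled.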
Comprehensive filter options with contextual dependencies enable highly targeted, relevant progress reporting using either templates or custom configurations.

Progress Report – Data Fields (Contextual Reference)

The Progress Report returns one row per user × activity × module relationship. Assignment modules are intentionally excluded.

  • Activity-level progress and module-level progress are stored separately
  • Module order in the report matches the module order in the activity

It contains a comprehensive set of fields that combine learner identity, organizational context, activity structure, progress logic, and certification lifecycle data. The exact columns available depend on platform configuration, module types, and platform governance settings. Below is an overview of commonly included fields and their meaning.

Customization options are the most comprehensive of all reports, reflecting this report’s role as one of the platform’s central reporting tools.
| Data Category | Example Fields Included | What This Category Represents |
| --- | --- | --- |
| Participant Identity | Full name, Email, User ID, External ID | Identifies the learner associated with the progress record. External IDs may reflect HR, CRM, or partner systems where configured. |
| Organizational Context | Organization, Sub-organization, Company, Country, Department, Job function | The organizational structure and profile metadata associated with the learner at the time of enrollment or progress tracking. Visibility depends on organization-layer configuration. |
| Activity Context | Activity title, Activity type, Activity ID, Activity status | Identifies the training activity (course, event, learning path, etc.) the progress record belongs to. |
| Module Context | Module type, Module title, Module status | Describes which module within the activity the progress relates to (e.g. course module, event, assignment, SCORM, existing training). |
| Enrollment & Signup Data | Enrollment date, Signup status, Registered by, Order type | Captures how and when the participant was enrolled, including manual or automated enrollment paths. |
| Progress & Completion | Progress %, Status (Not started / In progress / Completed), Completion date | Represents the learner’s current or final completion state, calculated according to activity and module rules. |
| Time & Engagement | Total time spent, Time spent during period | Indicates the amount of active learning time recorded for the participant within the selected reporting scope. |
| Assessment & Performance | Score, Pass/Fail, Test attempts | Assessment outcomes where applicable, including results from tests, SCORM packages, or H5P objects. |
| Attendance & Events | Attendance status, Webinar recording viewed | Tracks attendance-based modules such as events or webinars, including recording engagement where enabled. |
| Certification & Credentials | Certificate issued, Certificate expiration, Re-certification date | Captures credential outcomes linked to the activity or module, including expiration and re-certification logic. |
| Administrative Actions | Marked completed by, Completion comment, Forced completion flag | Records manual administrative interventions such as force completion, including audit-relevant metadata. |
| Scheduling & Validity | Activity start date, Activity end date, Deadline | Provides scheduling context that affects progress calculation, completion timing, and compliance interpretation. |
| Commercial Context | Price, Transaction ID, Order reference | Included when progress reporting intersects with paid enrollments or commercial tracking. |
| Reporting Metadata | Report scope, Date range, Worksheet context | Describes how the report was generated, including filtering scope and worksheet separation logic. |

Data Relevance and Freshness

The Progress Report is backed by a dedicated aggregation table optimized for performance. Data updates automatically when:

  • User profile data changes
  • Enrollment status changes
  • Progress or completion updates occur
  • Activity or module configuration changes

For exceptional cases, Eurekos Service Desk can trigger a manual recalculation to fully rebuild progress data upon your request.

Localization and Delivery

  • Column headers are translated into the report generator’s language
  • All timestamps respect the user’s configured timezone
  • Date formats follow Excel locale settings
  • Large reports are queued and delivered by email to prevent system overload 

Handling Blocked, Deleted, and Cancelled Data

Progress reporting respects platform governance settings:

  • Blocked users can be included or excluded based on configuration
  • Deleted users can be included or excluded based on configuration
  • Registered and Expired signups are always included
  • Cancelled signups are included only if the corresponding configuration allows it

This ensures flexibility between historical accuracy and privacy or compliance requirements. Contact Eurekos Service Desk for changes to standard behavior on the platform.


When to Use Progress Analytics vs Progress Report

| Use Case | Recommended Tool |
| --- | --- |
| Daily supervision and prioritization | Progress Analytics |
| Individual learner follow-up | Progress Analytics |
| Compliance documentation | Progress Report |
| External audits | Progress Report |
| BI or data warehouse integration | Progress Report |

Used together, they provide situational awareness and formal accountability—without duplication or ambiguity.

FAQ