This subtopic covers conceptual understanding of the transport and processing of learning event data, including:
- Data Collection Pipelines: Data pipelines begin with the capture of event data through hardware and software sensors, the tagging of learning records with contextual metadata, and the transport of those event records to a learning record store (LRS).
- Processing and Transformation Pipelines: After collection, data pipelines process, filter, clean, validate, and transform the raw (noisy) data into the form needed to support downstream uses. This typically involves multiple stages of processing, such as:
- Noisy Learning Record Store (LRS): A noisy LRS captures raw, unfiltered data directly from the learning environment. It stores everything, including irrelevant data points, which may still be valuable for exploratory analysis.
- Transactional LRS: The transactional LRS processes data from the noisy LRS, filtering out unnecessary information and keeping only data relevant to specific learning events. This refined data can then be used for further analysis or decision-making.
- Advanced Data Refinement: After filtering, data can be further processed for inference and the generation of assertions regarding learner progress. This includes making inferences about learner competencies and identifying milestones achieved, which can contribute to issuing certifications or providing targeted feedback.
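The staged pipeline above can be sketched in a few lines of Python. This is a minimal illustration, not a real LRS implementation: the in-memory lists, the `LEARNING_VERBS` set, and the function names (`capture_event`, `refine`, `infer_milestones`) are all hypothetical stand-ins for the capture, noisy-LRS, transactional-LRS, and refinement stages described above.

```python
from datetime import datetime, timezone

# Hypothetical in-memory stores standing in for the two LRS tiers.
noisy_lrs = []          # raw, unfiltered event records
transactional_lrs = []  # validated, learning-relevant records

def capture_event(actor, verb, obj, **context):
    """Capture a raw event and tag it with contextual metadata."""
    record = {
        "actor": actor,
        "verb": verb,
        "object": obj,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "context": context,  # e.g. course, device, session
    }
    noisy_lrs.append(record)  # everything lands in the noisy LRS
    return record

# Illustrative filter: only these verbs count as learning events here.
LEARNING_VERBS = {"completed", "answered", "passed", "failed"}

def refine():
    """Filter and validate noisy records into the transactional LRS."""
    for rec in noisy_lrs:
        if rec["verb"] not in LEARNING_VERBS:
            continue  # drop irrelevant data points (e.g. UI noise)
        if not rec.get("actor") or not rec.get("object"):
            continue  # validation: discard malformed records
        transactional_lrs.append(rec)

def infer_milestones():
    """Generate simple assertions about learner progress."""
    completed = {}
    for rec in transactional_lrs:
        if rec["verb"] == "completed":
            completed.setdefault(rec["actor"], set()).add(rec["object"])
    return completed

# Usage: one noise event and one learning event flow through the stages.
capture_event("alice", "moused-over", "button-3")            # noise
capture_event("alice", "completed", "module-1", course="CS101")
refine()
milestones = infer_milestones()
```

After running, the noisy LRS holds both records, the transactional LRS holds only the `completed` event, and `milestones` asserts that `alice` finished `module-1`; a real refinement stage would apply far richer validation and competency inference against the filtered records.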