
DIGITAL LITERACY ASSESSMENT

This computer skills assessment tests potential learners' ability to successfully navigate an online learning environment. The task-based, performance format is "cheat-proof," serving as both assessment and tutorial: users who try to memorize answers through brute force instead end up mastering the computer skills they need to succeed in a virtual classroom.

The project sample is a shortened, 5-item version of the full, 23-item assessment. 

PROJECT OVERVIEW

PURPOSE

Ensure learners starting VILT courses have the basic computer skills needed to focus on learning

Increase, rather than decrease, access to training

CLIENT & AUDIENCE
  • Non-profit employment & training agency

  • Participants in work programs

  • Case Managers in work programs

MY ROLE

Design

Development

SME

Implementation

Voice Talent: Trevor Stieg, Skills Instructor Team Lead, Forward Service Corporation

TOOLS

MS PowerPoint

Diagrams.net

Adobe Captivate

SnagIt

SCORM Cloud LRS

MS Power Automate

PROJECT DETAILS

THE NEED

A non-profit employment and training organization offers several virtual instructor-led courses to learners with a wide range of computer literacy skills. In many cases, learners have low confidence and skill in using computers, which makes learning in an online environment difficult.

 

To address this, registrants must pass a digital literacy assessment as a prerequisite. For years, the organization used a series of assessments from a third party, but those assessments didn’t quite meet their needs. They wanted something in-house and custom to meet the exact needs of their learners and streamline access to online training.

THE SOLUTION

The solution, built in Adobe Captivate, is part simulation, part assessment, and part tutorial. Users demonstrate their computer skills by performing a series of common computer tasks (like opening a browser) in a simulated desktop, contained within a single assessment. The tasks are limited to those most critical for navigating the online learning environment, and the assessment is hosted on the organization’s own domain.

 

Importantly, scoring is tracked internally with instant email notifications sent upon completion. The module is xAPI-enabled and SCORM compliant.

THE PROCESS

ANALYSIS

I started by surveying the entire training team, asking which computer tasks were the biggest stumbling blocks for learners. A small workgroup pared down the computer skills “wish list” to a short list of the most critical skills for making it to class and following directions.

 

Some classes required more computer-based tasks, so we created two lists of skills: one for the more technically involved courses and one for the courses that didn’t require as much tech savviness.

 

Learners who did not have needed computer skills often made it into class by retaking the assessments continuously until they memorized enough multiple choice answers to pass. They could pick out browser icons in a group of icons, but they could not open an internet browser on their computer. To prevent this, I created a performative, task-based assessment that requires learners to demonstrate they have the skills needed.

 

DESIGN

The task-based format meets two important needs. First, it directly assesses the required computer skills. Rather than assessing knowledge associated with performing those skills with multiple choice questions, it tests learners’ ability to perform them. 

 

Second, the task-based format serves as both assessment and tutorial. If learners take the assessment repeatedly, they aren't simply memorizing that the answers are A, C, C, B, A. They are memorizing the clicks and keystrokes for performing the tasks. They're learning and mastering the computer skills they need. 

 

In a sense, the task-based design allows potential learners to “fail up”. This ensures learners in class have the skills they need while helping more (rather than fewer) learners get access to the class.

 

To further prepare learners for class, I structured the assessment items to match the flow of tasks learners complete to prepare for and join class. Many tasks reasonably flow from one to the next, mimicking the actual application of those skills. I used flowcharts in the Diagrams.net app to demonstrate proof of concept to the team and collaborate on the details of navigation. 

 

Prior to development, I created an interactive prototype in PowerPoint. This gained buy-in from the team and helped me gather feedback on the user interface.

 

In addition to the simulation, the interface includes a black frame along the top and right sides of the screen for instructions and control buttons. Most UIs have menus or navigation along the top, left, or bottom, so I placed the assessment controls along the right to further differentiate them from the simulation. They are visible and accessible but unlikely to be mistaken for assessment inputs.

DEVELOPMENT

I developed this assessment in Adobe Captivate, using the software simulation tool to record screens and workflows. I started by recording and developing a few simple tasks to ensure the method would work and find the most efficient workflows. 

 

After recording, I used SnagIt to simplify the screen captures and hide personal information from demonstration accounts (e.g., Gmail). I added transparent smart shape buttons for alternative correct answers (e.g., the X button vs. File > Close in a Windows program) and for likely incorrect clicks. This prevents a task from being marked incorrect when a user clicks white space or an object that would take them somewhere the simulation does not go, and it allowed me to show "incorrect" feedback and track incorrect clicks.

 

Because of all this customization, I did not use preset question slides, but I still utilized the built-in quizzing and reporting functions by scoring the smart shape buttons that appear with feedback. This allows xAPI statements to be programmed for specific actions at a later time and keeps regular SCORM reporting functional, making the assessment more portable for the future.
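Tracking like this ultimately comes down to sending xAPI statements to the LRS. As a rough illustration only (the actor, verb, and activity IDs below are hypothetical, not the project's actual statement design), a statement for a single simulated task might be built like this:

```javascript
// Hedged sketch: builds an xAPI statement object for one assessment task.
// The activity base URI and task names are illustrative assumptions.
function buildTaskStatement(learnerEmail, taskId, taskName, success) {
  return {
    actor: {
      mbox: "mailto:" + learnerEmail,
      objectType: "Agent"
    },
    verb: {
      // Standard ADL verbs for pass/fail results.
      id: success
        ? "http://adlnet.gov/expapi/verbs/passed"
        : "http://adlnet.gov/expapi/verbs/failed",
      display: { "en-US": success ? "passed" : "failed" }
    },
    object: {
      // Hypothetical activity ID scheme for the assessment's tasks.
      id: "https://example.org/digital-literacy/tasks/" + taskId,
      definition: { name: { "en-US": taskName } }
    },
    result: { success: success }
  };
}

// Example: the learner successfully opened a web browser in the simulation.
const statement = buildTaskStatement(
  "learner@example.org", "open-browser", "Open a web browser", true
);
```

In practice a Captivate module reporting to SCORM Cloud sends statements like this automatically; the value of scoring each smart shape is that each task can later be mapped to its own statement rather than reporting only an overall quiz score.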

 

I conducted several rounds of user testing with staff and clients. I collected feedback in a Microsoft Form. Based on user feedback, I added additional correct “answers”, audio instructions along with static on-screen text for task instructions, and a report at the end that shows learners which tasks they performed correctly and which ones they did not.

 

It was critical that staff were notified automatically and immediately when their client had completed the assessment. I accomplished this with a JavaScript action that triggers a flow in Power Automate, which sends an email with the learner’s score to the learner and their Case Manager. Some learners will enter their Case Manager’s email incorrectly, so the flow includes a fail-safe email to the Training Manager, who can follow up with the client.
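The mechanics of that handoff are simple: Power Automate flows can expose an HTTP trigger URL, and a Captivate "Execute JavaScript" action can POST the score data to it. The sketch below is an assumption-laden illustration, not the project's actual code: the flow URL, custom variable names (`v_learnerName`, `v_caseManagerEmail`), and payload field names are all hypothetical, while `cpAPIInterface.getVariableValue` and the `cpQuizInfoPointsscored` system variable are Captivate's published runtime API.

```javascript
// Hedged sketch of a Captivate "Execute JavaScript" action that notifies
// staff via a Power Automate HTTP-triggered flow. URL and field names are
// illustrative assumptions.
const FLOW_URL = "https://prod-00.westus.logic.azure.com/workflows/example/triggers/manual/paths/invoke";

// Build the JSON body the flow would parse (field names are assumed).
function buildNotificationPayload(learnerName, score, caseManagerEmail) {
  return {
    learnerName: learnerName,
    score: score,
    caseManagerEmail: caseManagerEmail,
    completedAt: new Date().toISOString()
  };
}

// Only runs inside the Captivate runtime, where cpAPIInterface exists.
if (typeof window !== "undefined" && window.cpAPIInterface) {
  const payload = buildNotificationPayload(
    window.cpAPIInterface.getVariableValue("v_learnerName"),          // custom variable (assumed)
    window.cpAPIInterface.getVariableValue("cpQuizInfoPointsscored"), // built-in quiz score
    window.cpAPIInterface.getVariableValue("v_caseManagerEmail")      // custom variable (assumed)
  );
  fetch(FLOW_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload)
  });
}
```

On the Power Automate side, the flow's "When an HTTP request is received" trigger would parse this body and feed the fields into the email actions, including the fail-safe message to the Training Manager.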

 

We considered hosting the assessment on the LMS for reporting. However, that would require user accounts for every client, many of whom will never take training. It would also require clients to get a username and password, navigate to the LMS, log in, find the assessment, and then complete it: many steps that create snags for an audience already facing many barriers. 

 

Instead, the assessment is hosted on the organization’s training website, so clients can access it quickly and easily via a link.

IMPLEMENTATION

At the time of this writing, this project is still in the implementation phase. The assessment is accessed with a link available on the course registration form. Thanks to the widespread staff involvement in user testing, many Case Managers are already aware of the assessment and excited to use it. 

EVALUATION & ITERATION

My recommendations for evaluation include tracking changes in the number of clients registered and speaking with instructors about the preparedness of learners. There is also a link inside the assessment for users to report bugs and malfunctions. After a year of use, feedback and bug reports can be used for a targeted revision. 


©2020 by Kaylee O'Connell. Proudly created with Wix.com
