Make the Most of Your eLearning Review

Every month, B Online Learning facilitates Articulate and eLearning workshops around Australia. Lately, some of our participants have been asking questions about the review process and how to make it more efficient. For some people this can be a difficult, frustrating and time-consuming process.

During the review phases it’s important that we check for:

  • Accuracy
  • Consistency
  • Functionality

Whatever instructional design model you use, it’s crucial that reviews/evaluations are a regular feature. This ensures that there are no nasty surprises at the end for the client or other key stakeholders.

Some of the advice I give in the Master eLearning Course includes:

  • Decide how many reviews there will be at each stage of the project.
  • Decide what the desired outcome is after each review cycle. For example, upon completion of the storyboard, you will need to proofread everything and check that all your content is accurate and up to date. At the end of the chunking/storyboarding stage, will you bring in your Subject Matter Expert to confirm that the content meets the learning objectives?
  • Who will participate in each review phase? During the User Trial in the LMS, do you get a potential/past learner to review the course? It’s useful to get this clarified during the Definition phase of the project to make sure that all trial participants will be available.
  • How long will these reviews take? I always feel the shorter the better. If users have an extended period to provide feedback, chances are they’ll forget and it will go to the bottom of the email pile. It’s worthwhile having a Project Planning document that covers key milestones and dates. This should be completed during the Definition phase of the project.
  • How will you prepare and support the participants with these reviews? Will you follow up with a phone call, or will all the feedback be gathered online? Some of our recent Master eLearning students preferred the option of meeting their trial group face to face and gathering feedback that way.
  • Will you use a structured list of questions to gather feedback or a template so that the trial group can add their own comments? An example of a structured question list is shown below.
    1.  Does the course launch OK? (Y/N)
    2.  Does the course navigation work OK? (Y/N)
    3.  Are all links within the course working? (Y/N)
    4.  Are graphics appearing in the right place? (Y/N)
    5.  Are there any typos? (if Yes, please give description and location of error)
    6.  Is the course accessibility compliant? (Y/N)
    7.  Are bullets, numbers, and capitals in line with style guide? (Y/N)
    8.  Does the sound/video work properly? (Y/N) (if applicable)
    9.  Does the bookmarking work? (Y/N) (if applicable)
    10.  Is the feedback for question options appearing for all correct and incorrect answers? (Y/N) (If No, please give description and location of error)
    11.  Do branching questions arrive at the correct destination? (Y/N)
    12.  Is the number of question attempts set correctly? (Y/N)
    13.  Is the scoring in tests/quizzes/assessments correctly recorded? (Y/N)
    14.  Are all hotspots working? (Y/N)
    15.  If a Download option is available, does the course download successfully? (Y/N)
    16.  Does the downloaded course launch offline correctly? (Y/N)

In conclusion, it’s worthwhile taking some time in the ‘Definition Phase’ of the project to plan your review cycles. Reviewing and trialling your eLearning resource is an important part of the development process. Effective reviews and user trials can identify areas requiring improvement and help to ensure that your eLearning resource will provide a meaningful learning experience for your learners.
