Start your course on the right foot by running through these key quality checks.
As you get ready to launch your course, you may feel tempted to push through and let it loose in the wild so that you can start collecting feedback. In general, we support that instinct. Beta tests are important, and there’s nothing like user feedback to show you where your course needs improvement. It’s usually better to put your course in front of people to see what they say than hold it back indefinitely because it isn’t yet perfect.
That said, for the sake of your course (and out of respect for your beta testers), you should give your course a thorough quality check before putting it in front of actual users. While beta tests are all about finding the flaws in your course, they can still go badly if the course is so riddled with errors that it's unusable. Even the most earnest beta tester could walk away feeling like your course was a waste of time.
And once your course is launched in full, there’s no way to undo the bad impression caused by a flawed course design or course content riddled with errors.
So, while some mistakes are bound to slip through, it’s still of the utmost importance that you run your own pre-launch quality checks to keep them to a minimum. Here’s where to start.
1. Ask someone else to proofread your course.
Everyone makes proofreading errors. For the most part, it's not a matter of intelligence or laziness; it's just how our brains work. After staring at your own writing for hours on end, you become blind to your own mistakes. You can read a piece of writing twenty times and still miss a homonym error or an incorrect pronoun.
You should, of course, read through your own writing and do your best to find and correct errors. But when it comes to your course materials, a second pair of eyes will catch what you missed.
Remember, many proofreading errors hide in prominent places, like headlines or graph titles. As important as this text is, it's also the writing we're most likely to skim past. Give it a second look, and don't forget graphs, tables, and captions while you're at it.
2. Check across multiple devices and browsers.
If you've built your entire course from your desktop while working in Chrome, you may be missing display or usability errors on other devices. A font that displays well in one browser may render differently in another, while an infographic that looks great on desktop may be unreadable on mobile.
Instead of trusting everything to work, go through your course yourself on a variety of devices and browsers. You can download and install the most common browsers yourself, or you can enlist some friends or colleagues to help.
The most popular desktop browsers are Chrome, Safari, Firefox, Opera, Internet Explorer, and Microsoft Edge. Many of these browsers also have mobile apps, which are separate from the native browser apps installed on iOS, Android, and Amazon phones. While some of these browsers may account for only 3% of your market, that's still significant enough to affect your sales.
3. Test your automated email workflows.
We’ve talked before about different automations you can do to improve your marketing and support your learners. These emails usually involve triggers, such as a failed test or a lengthy absence. Once they’re set up and working well, they’re designed not to require a lot of maintenance. After all, their whole benefit lies in removing some of your workload—not adding to it.
However, it’s easy to overthink a workflow, or create one with a series of cascading trigger events that end up overwhelming a user rather than helping them. And it is just as easy to set up a trigger incorrectly, such that it never sends the intended email at all.
To check how they’re working, use a test account to go through the course yourself to set off the triggers. Check to be sure the emails came through, and interact with the emails just as you expect your learners would. Pay attention to anything that doesn’t respond the way you thought it would.
4. Click all the buttons.
A broken link on your home page? A test submission button that doesn't actually submit? Navigation that leads to the wrong place? These are all common errors that a quality check can catch. But, like proofreading errors in a headline, they can be easy to miss.
You might be blind to that extra CTA in your page footers, but your users won’t be (at least, not if it’s doing its job). Maybe you added it early on when you were building your course, then changed or deleted the page it linked to. Now it leads nowhere, and your learners are confused and frustrated by it.
Or maybe you added a button in your email encouraging learners to comment in the forums, but instead it leads to the course home page. Even basic function buttons, like those for online quizzes or those that advance your learners through a lesson, need to be checked, unless you've used them enough yourself to be confident that they're functioning correctly.
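If your course pages are published on the web, you don't have to rely on clicking every link by hand. Here's a minimal sketch of a link checker using only Python's standard library; the course URL passed to `check_links` is a placeholder, and this only covers plain `<a href>` links, not buttons wired up with JavaScript, so it supplements rather than replaces the manual click-through.

```python
from html.parser import HTMLParser
from urllib.request import urlopen
from urllib.parse import urljoin
from urllib.error import HTTPError, URLError


class LinkCollector(HTMLParser):
    """Collects the href target of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])


def find_links(html):
    """Return all <a href> targets found in an HTML string."""
    parser = LinkCollector()
    parser.feed(html)
    return parser.links


def check_links(page_url):
    """Fetch a page and report the HTTP status of each link (200 = OK)."""
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    for link in find_links(html):
        # Resolve relative links like "/lesson-1" against the page URL.
        target = urljoin(page_url, link)
        try:
            status = urlopen(target).status
        except HTTPError as e:
            status = e.code  # e.g. 404 for a deleted page
        except URLError as e:
            status = f"unreachable ({e.reason})"
        print(f"{status}  {target}")
```

Running `check_links("https://example.com/my-course")` against your real course URL prints one line per link, so any 404s from pages you changed or deleted stand out immediately.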
Your course will never be perfect, but a good quality check can get you close.
Try as you might, you're almost guaranteed to miss some problems with your quality check. The important point is not that your course is error-free, but that it's achieved a quality standard that makes it worth testing.
When it comes to your beta test, the fewer minor errors your testers must contend with, the more they can focus their attention on the more important questions. If they’re so worn down by spelling errors that they miss large-scale issues—like missing information in your lesson plan or a confusing feature—then their feedback won’t be as useful as it could have been.
So, run your quality checks first so that your testers—and eventually your users—can appreciate the best parts of your course without being bogged down by minor errors. It may take some time, but it will be well worth your effort.