Two-year-olds and ice cream

As a creator of online education for emergency medical services, I am routinely tasked with creating one-hour modules focused on specific topics. Over the past two years, I have built lessons focusing on endocrine emergencies, triage tactics, and infectious disease. The module that has caused the most problems, though, is a course I built using another educator's content, focusing on medical math.

The goal of the course was to create a module that would review relevant math concepts, including medication administration, drip rates, and measurement conversions. The content provided by the other instructor was thorough and focused, with good objectives that provided a ready means for learner evaluation. This course was my first using the Articulate Storyline software, which worked well and allowed a great deal of design flexibility. The course was completed on time and within allowed resources.

So what was the problem? I never considered how the user would navigate the course and fill in the quiz questions. While there were some successful completions, most of the student reviews were disappointing, expressing frustration at how the answers were graded. You see, I used fill-in-the-blank questions, which require students to answer in a specific way. I did not anticipate how many issues would be created by this decision. One student described the course as “…similar to watching a 2 year old eat ice cream, it’s messy, but eventually gets the job done”. Sigh.

Since the course was first published, I have taken it offline three times to perform various rewrites and adjustments. Each time the course improves, but it keeps having issues and is consistently our lowest-rated module. What could I have done differently? I believe being more diligent about identifying the users and their capabilities would have been a great first effort. I have come to understand, based on this experience, that my implementation strategy was not effective. I should have tested the course more thoroughly among the target user group. This would have identified how the answers should have been formatted and avoided a great deal of repeat work and frustration. I also should have spent more time in Step 3, “…document, in high resolution, everything you are going to be building” (Greer, 2010). By having a clear understanding of how the finished product should work and look, I could have avoided many issues.

In retrospect, I learned a great deal from building and maintaining this course.  I have since been much more diligent about meeting the user’s needs and ensuring the course is clearly defined.



Greer, M. (2010). The project management minimalist: Just enough PM to rock your projects. Laureate special edition. Retrieved from


9 thoughts on “Two-year-olds and ice cream”

  1. genahloger


It’s great that you have been able to learn from this experience. It’s really difficult to think of every last detail from the students’ perspective, but you are right in saying that a proper beta test with the target users, and clearly documenting everything you were going to be building, would have helped in the beginning. I hope the continued revisions will help you get closer to where you want the module to be! Thanks for sharing your experience; it reminded me how important beta testing really is before making a training product go live.


  2. LorenSpinks

    Hi Don! Thanks for your post. I enjoyed reading about your project. I’m glad that you were able to take it all in stride and make so many revisions to improve the course. No course is perfect the first time, even when it goes through beta. Evaluation and an open mind are so important. It sounds like your students’ needs are very important to you, and I think that will continue to take you great places with your work! – Loren

  3. lucasmontrice

    Hi Don,
I enjoyed reading about your project. I think you have an amazing job, being able to help EMS learners train to be effective and refresh their memory of important information. I think it is great that you have evaluated and thought of ways that you could have avoided the problems that have arisen. I hope that you are able to successfully improve this course completely. Maybe you could get an overall analysis or survey of all the issues from the learners, and get feedback from the professors about things they think would make this program exemplary.


  4. Kelley K.

Hi Don, Thanks for your post. I also teach online courses, and as the educator I have felt so bad when the students are frustrated with the flow of the online setup (whether it be content, discussions, or testing methods). You took their concerns and feedback seriously, and I believe a good educator will always do this. Being an online student has helped me be a better online educator.

  5. Stephanie DeVee

    Hi Don,
Loved your post. I am a teacher, so I love following others who have a different background than I do. First of all, I must say I loved your title; when deciding whose posts I would read this week, yours definitely caught my attention. I was wondering if you were working with anyone else when creating the module that seemed to give you some trouble? It made me think back to a resource I was using for my own blog post that talked about how important it is to form the right team. It sounds like you are doing a great job with what seems like a large project, and I just wonder if there are others around to help you through the process. Murphy talked about the different roles in a project, and the one that jumped out to me was the role of an end user (Murphy, 1994). It seemed that this was your missing link: someone to test out the program for you before you posted it for others to use. I know for me, when working on anything myself, it is hard to catch my own oversights, and it is usually beneficial to have an outsider check it out to see if it makes sense to them. Either way, I am impressed by your line of work and your willingness to keep at it until it works for all your users. I know too many situations where, once it is done, the creator doesn’t bother to make any adjustments and instead expects the learner to do so! Well done.
    Murphy, C. (1994). Utilizing project management techniques in the design of instructional materials. Performance & Instruction, 33(3), 9–11.

  6. Amanda Sutliff

    Hi Don,

I have actually pondered the use of short-answer response questions in online learning before. It seems like a good idea at first, because the student needs to do more thinking rather than just choose from a list of multiple choices. But how do you grade short-answer responses? I use Edmodo in my classroom, and you can create short-answer quiz questions. Edmodo will automatically grade multiple choice and true or false, but the teacher must manually read and grade the short-answer and essay questions. This is okay, though, because most of the time students’ answers are quite different, yet their responses are generally on the right track and should receive full points.

Is it possible with the program you used to grade students manually, so that they wouldn’t need to answer the question with the correct answer word-for-word? Instead, you could analyze the answers they give on an individual basis and decide if they have the right idea, even if they didn’t say it exactly perfectly. That would be a lot of extra work for the course facilitator, but it would avoid letting the computer automatically score quizzes and take off points when students actually understand the material. I’m curious what kind of revisions you have made to the course. Good job with persevering and making your project work in the end.


    1. dstroup461 Post author

      Hello Amanda,
I have considered incorporating questions where I would go in and grade the responses (it would be so nice to get away from multiple choice and true/false!), but the courses are asynchronous, averaging about 300 contacts a month, so the effort would be very difficult to say the least. I have used fill-in-the-blank questions on some of my smaller hybrid classes, though, with great effect, but, you are right, it takes some work to make them successful.

The software I am using is Articulate Storyline (love it!). The fill-in-the-blank question type allows you to create up to 10 alternative answers in order to cover any version of the correct answer. It works pretty well with words, but when you incorporate numbers and abbreviations, the whole thing falls apart. Lesson learned!
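To see why an alternative-answer list falls apart with numbers and abbreviations, here is a minimal sketch in Python. This is a hypothetical illustration of exact-match grading against a list of accepted forms, not Storyline's actual implementation; the function name, the accepted list, and the dosage values are all made up for the example.

```python
def grade_exact(response: str, accepted: list[str]) -> bool:
    """Mark correct only if the response exactly matches one accepted form,
    ignoring case and surrounding whitespace."""
    normalized = {a.strip().lower() for a in accepted}
    return response.strip().lower() in normalized

# Even several alternatives can't cover every valid way to write one dose.
accepted = ["250 ml", "250ml", "250 milliliters", "0.25 l", "0.25 liters"]

print(grade_exact("250 mL", accepted))    # covered by the list
print(grade_exact("250.0 mL", accepted))  # same dose, but not a listed form
print(grade_exact(".25 L", accepted))     # same dose, but not a listed form
```

A word answer like "epinephrine" has only a handful of spellings to list, but a numeric answer has effectively unlimited equivalent forms (trailing zeros, unit abbreviations, converted units), so exact matching marks correct answers wrong.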

Thanks for your comment.

