By Thomas Garbelotti on December 10, 2017
One of the most powerful and effective functions of UCLA’s Common Collaboration and Learning Environment (CCLE) is the modestly named Quiz tool.
Far more than a way to create quizzes, the Quiz tool lets instructors build a variety of low- or high-stakes assessments, providing a clear evaluation of student learning (at both the micro and macro level), minimizing the grading workload, and freeing up instructional time by moving tests and quizzes out of the classroom. Students share in these benefits: they receive instant feedback, and the flexibility of taking a test online lets them better balance their own schedules and workloads.
Creating assessments in CCLE can seem daunting and complex. Building the content does take some work, but together with instructors we’ve developed a scalable and sustainable schema that works. Successfully used by a growing number of language departments at UCLA, and combined with our online proctoring software (Respondus), the CCLE Quiz tool enabled instructors to deliver over 30,000 high-stakes assessments in the last academic year.
A tiered approach
Our approach is to build assessments as a package, rather than as individual quizzes or tests. Each quiz activity is mapped to a discrete learning objective, allowing students to work on just that objective in a formative manner. But each activity is also designed to fit into a larger course assessment package: it can be combined with other quiz activities into a broader assessment. As a result, instructors can deliver a summative testing event (e.g., a cumulative quiz, midterm, or final) built on already existing quiz content.
The steps include:
- Determine the learning objectives that are to be assessed.
- Determine the category (or categories) that will support each.
- For each category, create the questions that will be used to assess the learner.
- Create an assessment wireframe using the Quiz tool. This can include:
- Individual formative assessments per learning objective (as an activity);
- Grouped assessments as an intermediate summative exercise (as a quiz);
- Tiered summative assessments (as a midterm or final).
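If it helps to see the structure in one place, here is a minimal sketch (in Python, with made-up names; no coding is involved in CCLE itself) of how learning objectives, question-bank categories, and assessment tiers relate:

```python
# A minimal sketch of the tiered schema. All names are hypothetical;
# in CCLE the same structure is built through the Quiz tool's
# question-bank categories and quiz settings, not through code.

# Each learning objective is backed by one question-bank category.
objectives = {
    "week-1 vocabulary": "lang-1-week-1-vocab",
    "week-1 grammar":    "lang-1-week-1-grammar",
    "week-2 vocabulary": "lang-1-week-2-vocab",
}

# Assessment tiers reuse the same categories at different scopes.
assessments = {
    # Formative: one activity per objective, unlimited attempts.
    "week-1 vocab practice": {
        "categories": ["lang-1-week-1-vocab"],
        "questions_per_category": 5,
    },
    # Summative: several objectives combined into one quiz.
    "week-2 quiz": {
        "categories": ["lang-1-week-1-vocab",
                       "lang-1-week-1-grammar",
                       "lang-1-week-2-vocab"],
        "questions_per_category": 5,
    },
}
```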
As a simple example, consider the approach to testing knowledge of a defined vocabulary in a foreign-language course. In the schema we have developed, a category such as the following is created, and a number of questions (say 10) are added to it:
lang-1-week-1-vocab
During the first week of instruction, a simple formative activity (using the Quiz tool) is created, in which students are given five random questions for practice. At this point the assessment is designed as a simple learning exercise, so students are allowed unlimited attempts until they are satisfied with their understanding of the material and their quiz results. By viewing the grade distribution for this discrete activity, the instructor can quickly determine whether the class’s progress on this objective is as expected, or whether more explanation is warranted.
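In Quiz-tool terms, this weekly activity amounts to a small set of settings. The sketch below (Python again, with illustrative values only) mimics the setup described above: five random questions drawn from the week-1 vocabulary category, with unlimited attempts:

```python
import random

# Hypothetical stand-in for the week-1 vocabulary question bank; in CCLE
# these questions live in the Quiz question bank, not in code.
bank = {"lang-1-week-1-vocab": [f"vocab question {i}" for i in range(1, 11)]}

# Settings mirroring the formative activity described above.
activity = {
    "category": "lang-1-week-1-vocab",
    "questions_per_attempt": 5,   # five random questions per attempt
    "attempts_allowed": None,     # None stands in for "unlimited"
}

# Each attempt gets a fresh random draw from the category.
attempt = random.sample(bank[activity["category"]],
                        activity["questions_per_attempt"])
print(attempt)
```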
Repeating this weekly over the quarter yields multiple categories (lang-1-week-2-vocab, lang-1-week-3-vocab, etc.), each of which is useful not only as its own activity but also as a building block for a cumulative quiz or midterm. For that larger assessment, questions from each category are resurfaced in randomized form, requiring no additional content work from the instructor. They can also be combined with questions from other categories (e.g., grammar or reading comprehension) to deliver a comprehensive, even high-stakes, assessment simply by randomizing questions pulled from multiple question banks.
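The larger assessment is assembled the same way, just across more categories. The sketch below (hypothetical helper and bank names) shows the idea of a midterm blueprint that draws a few random questions from each weekly vocabulary bank plus a grammar bank:

```python
import random

def build_assessment(banks, plan):
    """Draw the requested number of random questions from each category.

    `banks` maps category name -> list of questions; `plan` maps
    category name -> how many questions to draw. Both are hypothetical
    stand-ins for the Quiz tool's "add a random question" setting.
    """
    paper = []
    for category, count in plan.items():
        paper.extend(random.sample(banks[category], count))
    random.shuffle(paper)
    return paper

banks = {
    "lang-1-week-1-vocab": [f"w1 vocab q{i}" for i in range(10)],
    "lang-1-week-2-vocab": [f"w2 vocab q{i}" for i in range(10)],
    "lang-1-grammar":      [f"grammar q{i}" for i in range(10)],
}

# Midterm: three questions from each weekly vocab bank, four from grammar.
midterm = build_assessment(banks, {
    "lang-1-week-1-vocab": 3,
    "lang-1-week-2-vocab": 3,
    "lang-1-grammar": 4,
})
print(midterm)
```

Drawing a fixed number of questions per category, rather than one big random pull, keeps coverage balanced across the objectives you set out to assess.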
The net result of this schema is a simple, reusable assessment resource within CCLE. No further updating is needed for future use unless you want to change the testing structure to meet new or different learning objectives. Because all of the content work is done within the Quiz question bank, you can flexibly add, remove, or change questions without altering the framework of any assessment you’ve already created.
Scalability and future proofing
Content creation takes real effort and can seem overwhelming. This tiered approach to assessment design lets you start with a minimally viable set of quiz activities and build incrementally, as time and inspiration allow. With the framework built around question categories, you can begin with a small number of questions in each category and simply add more as time and effort permit. Even adding just a couple of questions per term, your question bank will grow quickly.
Furthermore, with your content and questions carefully organized into defined categories, transferring them to a new tool or system is much easier, and they can be saved offline as well.
Ease of creation
Creating good, effective questions in CCLE takes effort. Working in tandem with instructors, we’ve found that this tiered approach makes quiz creation as easy and fast as possible.
We’ve also refined the workflow to make creation even easier. Using a Google document or a spreadsheet (the latter thanks to a custom plug-in developed here at HumTech), we can batch-import your questions into CCLE, tens or hundreds at a time. This not only lets you easily see what you’ve created, but also saves hours of time and hundreds of clicks compared with creating questions one at a time.
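The exact spreadsheet layout the plug-in expects is something we’ll walk you through in a consultation, but the general idea is one question per row. The sketch below uses a purely hypothetical CSV layout and a bit of Python as a pre-import sanity check; it is not the plug-in’s actual template:

```python
import csv, io

# Hypothetical spreadsheet export: one question per row. The actual
# columns expected by the HumTech batch-import plug-in may differ;
# confirm the template during a consultation.
sheet = io.StringIO("""category,type,question,answer
lang-1-week-1-vocab,shortanswer,Translate 'book',libro
lang-1-week-1-vocab,truefalse,'Gato' means dog,FALSE
lang-1-week-1-grammar,multichoice,Pick the correct article for 'agua',el
""")

# Quick sanity check before import: every row has the required fields
# and lands in an intentional category.
required = {"category", "type", "question", "answer"}
rows = list(csv.DictReader(sheet))
assert required <= set(rows[0].keys())

by_category = {}
for row in rows:
    by_category.setdefault(row["category"], []).append(row["question"])
print({cat: len(qs) for cat, qs in by_category.items()})
```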
The spreadsheet plug-in supports batch import for the following question types:
- Cloze
- Essay
- Multichoice
- Ordering
- Short answer
- True and false
- Voice Recording
We strongly recommend a quick consultation and demo of this in action before you begin. Please reach out to us at ritc@humnet.ucla.edu to arrange a time to meet.
For more information on this approach, the Quiz tool, and Respondus, please see our instructional support pages.
Photo courtesy of KF [Public domain], via Wikimedia Commons.