Configuring test questions
Hello, I am having some problems setting up test questions to function the way I want them to in Claro. Here is what my client wants:
The learner answers all questions in the test. There should be no feedback. Just select an answer, click Submit and move to the next question. At the end, if the passing score is achieved, the correct answers for all questions can be shown. The learner is given two attempts to pass. If they fail the first one, they can go back and take it again. If they fail again, they have no more attempts and must take the course over.
I have created the test by adding questions using the "As Question" option. However, when I run the preview and take the test, it is marking the questions with a check mark or an X, for right or wrong, immediately. Then I have to click the Submit button a second time to advance to the next question because the "Next" navigation arrow is disabled. As all the options on the Question tab are greyed out, I can't configure anything.
I am not sure how to configure the test to function in the way described above. Is this possible in Claro?
Answers ( 5 )
NOTE: I've edited this answer since posting it
Hi Tim -
Most of the functionality you are looking for is controlled by the Publishing Profile, which has settings for test behaviour as well as many other options. So you'll want to create a new Publishing Profile with the behaviours you want. Here's a link to more info on Publishing Profiles overall. (A Publishing Profile can apply to either Claro or Flow projects; there is no functional difference.)
Here are some of the specific settings to work with based on what you've identified above.
There should be no feedback. Just select an answer, click Submit and move to the next question.
The Publishing Profile has a setting for whether or not the learner should see feedback after submitting an answer to a question. In a Claro project, you'll see that the Submit button will now auto-advance to the next question. In a Flow project, though, the Submit button will still turn into Next (more on this below).
At the end, if the passing score is achieved, the correct answers for all questions can be shown.
The Publishing Profile has settings for allowing the learner to review the test questions, including only reviewing if they have passed.
The learner is given two attempts to pass. If they fail the first one, they can go back and take it again. If they fail again, they have no more attempts and must take the course over.
The Publishing Profile has a setting for allowing re-attempts after a failure, plus a setting for the number of re-attempts. (If two attempts are wanted in total, set that value to 1.) Tracking that the learner has retaken the course and can then attempt the test again will be an LMS function, not an inherent course function; every LMS handles this a bit differently.
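The re-attempt arithmetic can be sketched as follows. This is a generic illustration of the logic, not Claro's or any LMS's actual implementation, and the function name is hypothetical:

```python
# Generic sketch of the two-attempt logic (hypothetical; not Claro/LMS code).
# The Publishing Profile's re-attempts value counts retries AFTER the first
# attempt, so "two attempts in total" means re-attempts = 1.

RE_ATTEMPTS = 1                    # Publishing Profile setting
TOTAL_ATTEMPTS = RE_ATTEMPTS + 1   # first attempt + allowed retries

def can_retry(attempts_used: int, passed: bool) -> bool:
    """True if the learner may take the test again without retaking the course."""
    return (not passed) and attempts_used < TOTAL_ATTEMPTS

# First failure: one attempt used, a retry is still available.
print(can_retry(attempts_used=1, passed=False))   # True
# Second failure: both attempts used; per the requirement, the learner
# must now retake the course (a reset the LMS, not the course, performs).
print(can_retry(attempts_used=2, passed=False))   # False
```

The key point is the off-by-one: the setting stores retries, not total attempts, so a client request for "two attempts" translates to a value of 1.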
Submit button in Flow: As I noted, the default Submit button behaviour in Flow is to turn into a Next button after the learner submits. You can turn off the default Submit button for each question, though, by selecting the Disable Submit check box on the Question tab. You can then add a Submit button of your own - there's one available on the Insert tab in the Buttons section. If you add that Submit button to the page, it is already pre-programmed to perform the submit-question function, but it won't turn into a Next button. The behaviour at this point would be that the learner submits, then uses the Next button on the nav bar to go to the next question.
You can further tweak this by adding a second action to the custom Submit button to go to the next page automatically. To do this:
- Select the custom-added Submit button
- On the Interact tab select Actions
- Select Navigation and Branching
- Select Next Page
- In the lower right corner of the panel, select Next (the right-side panel will open)
- On the right-side panel the action will be all set up for you, and you can just select Apply. One thing I'd suggest, though, is to consider setting the Timer to 1.0 or maybe 2.0 so there's a slight delay between when the learner clicks Submit and the page advances. Having it go to the next page immediately might surprise learners or seem abrupt.
(You'll need to do this for each question.)
Hope this helps - let us know if you have further questions as you work through this!
I am still having an issue with two of the tests. The course has a total of eight tests in each of the English and French versions. All of the English tests work fine, and six of the eight French tests work fine. By that, I mean that when the learner clicks the response button (True or False / Vrai or Faux), they automatically advance to the next question. At the end they see their total score. But in two of them, when they click the Submit button, an X or a checkmark appears, depending on whether they got the answer right or wrong. Then they have to click again to advance to the next question. What settings would I need to look at to fix this? Since they were all set up the same way, I am at a loss as to why these two function differently.
Both the English and French versions of the course are using the same Publishing profile. Each course contains 8 tests, two of which function differently from the others in the French course. There must be something specific to the configuration of those two tests, which I can't determine.
Ah, I misread and thought you had eight separate courses with tests for each language, rather than just one course per language.
I'll ask our support team to take a peek at the content in question to see what might be different. They'll reach out to you for course name details, etc.