1.3 List of Figures
Fig 1 An example Snap! program.
Fig 2 A typical example of BJC curriculum which includes graphical output.
Fig 3 A typical example of a CodeStudio problem that gives students only a few blocks to work with, and has a fairly constrained solution space.
Fig 4 The initial page is a list of questions to try.
Fig 5 Administrators have additional functionality.
Fig 6 Creating a new course is a simple action which requires little information.
Fig 7 A dashboard showing autograder submission times for the first two labs.
Fig 8 The initial (edX) version which had a heavily integrated feedback button.
Fig 9 Updated controls for the autograder showing a dropdown menu. (The controls for reverting submissions are greyed-out.)
Fig 10 An example of the feedback presented when everything is correct.
Fig 11 An example of feedback showing some failing cases.
Fig 12 Snap! can be embedded in edX through JSInput.
Fig 13 When a student clicks on the link, a new tab will open with the proper question they are assigned. Clicking the "Get Feedback" button triggers a submission which sends the grade back to the LMS.
Fig 14 A very basic LTI launch sequence. Image from the IMS {{ "ims-img" | cite }}.
Fig 15 Number of students by number of questions attempted.
Fig 16 Lab 11: submission times by day.
Fig 17 Lab 12: submission times by day.
Fig 18 Lab 14: submission times by day.
Fig 19 Number of autograder submissions by hour of the day.
Fig 20 Number of autograder submissions by day of the week.
Fig 21 Most students appear to submit only once, at the end of their work.
Fig 22 Number of students by number of times submitted for each lab.
Fig 23 Students are fairly evenly split between preferring online vs. oral lab checkoffs.
Fig 24 Most students completed checkoffs alone, and found the feedback easy to interpret.
Fig 25 Most students reported completing work before using the autograder.