Abstract
Providing timely, consistent marking with clear feedback to students is a common struggle, especially with large class sizes and tight deadlines for progression. One approach that can be employed in some domains is automated assessment, whereby marking is performed by software, usually by checking submissions against a sample solution. Although not uncommon in computer science, such tools are generally applied exclusively by marking staff after submission. Another option is to give students access to the automated tools in some form before submission, providing instantaneous formative feedback using some or all of the same criteria against which their summative marks will be assigned.
To this end, in 2018 a first-year module project was set to c. 400 learners to implement a database. The intention had always been to mark this work automatically using tools (a bot) that compared submissions against a specimen meeting all the requirements, but in this case students were also given early access to the bot. This allowed them to upload their proposed solutions and run them against a subset of the real marking tests (not all tests were included, but the types and areas to be tested were well covered), producing detailed output and a score against the theoretical maximum for their submission. During the 33 days of the project, 6,231 submissions were made to the bot and feedback was given on each. A survey of the students returned extremely positive feedback: those who used the tool found it invaluable in helping them understand what was required and where their submissions were in error. Providing this facility also reduced the number of queries reaching academic staff, appeared to promote students' problem-solving abilities, and helped ensure adherence to the required format.
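The abstract describes the bot as running an uploaded solution against a subset of the real marking tests and reporting detailed output plus a score against the theoretical maximum. As a rough illustration only, the sketch below shows one way such a harness could work for a database project, comparing query results from a submission against a specimen that meets all requirements. The names, queries, and file layout here are assumptions for illustration, not the tool used in the study.

```python
# A minimal sketch (not the authors' implementation) of a marking bot that
# compares a student's database against a specimen solution.
# All names (MarkingTest, run_tests, the example queries) are hypothetical.
import sqlite3
from dataclasses import dataclass

@dataclass
class MarkingTest:
    description: str   # shown to the student in the feedback output
    query: str         # the same query is run against both databases
    marks: int         # marks awarded if the result sets match

def run_tests(student_db: str, specimen_db: str, tests: list[MarkingTest]) -> None:
    awarded, maximum = 0, sum(t.marks for t in tests)
    with sqlite3.connect(student_db) as student, sqlite3.connect(specimen_db) as specimen:
        for test in tests:
            expected = specimen.execute(test.query).fetchall()
            try:
                actual = student.execute(test.query).fetchall()
            except sqlite3.Error as err:
                # Detailed feedback even when the submission breaks: report the error.
                print(f"FAIL  {test.description}: query error ({err})")
                continue
            # Compare as multisets so row order does not affect the mark.
            if sorted(actual) == sorted(expected):
                awarded += test.marks
                print(f"PASS  {test.description} (+{test.marks})")
            else:
                print(f"FAIL  {test.description}: got {len(actual)} rows, "
                      f"expected {len(expected)}")
    # Score against the theoretical maximum, as described in the abstract.
    print(f"Score: {awarded}/{maximum}")

if __name__ == "__main__":
    tests = [
        MarkingTest("Customers table exists with expected rows",
                    "SELECT * FROM customers ORDER BY id", 5),
        MarkingTest("Orders join returns correct totals",
                    "SELECT customer_id, SUM(total) FROM orders GROUP BY customer_id", 10),
    ]
    run_tests("submission.db", "specimen.db", tests)
```

Releasing only a subset of such tests to students, as the study did, preserves the integrity of the final summative run while still giving formative coverage of the types and areas being tested.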
Original language | English |
---|---|
Publication status | Published - May 2018 |
Event | Irish Learning Technology Association Education Technology Conference 2018: ILTA EdTech 2018 - Carlow Institute of Information Technology, Carlow, Ireland |
Duration | 31 May 2018 → … |
Internet address | http://ilta.ie/project/edtech2018/ |
Conference
Conference | Irish Learning Technology Association Education Technology Conference 2018 |
---|---|
Abbreviated title | EdTech 2018 |
Country/Territory | Ireland |
City | Carlow |
Period | 31/05/2018 → … |
Internet address | http://ilta.ie/project/edtech2018/ |