Beginnings

 

From the literature studied, I was particularly interested in the findings of Hoskins and Newstead (2009) on student motivation in higher education. They found that students can become amotivated, defined as an absence of motivation (Ryan and Deci, 2000), sometimes as a result of learning experiences. These students may not know why they are on their current course of study, may feel incompetent or may feel that their learning is out of their control. They may have lost motivation due to poor feedback, high workload or a perception that their learning is irrelevant to their personal goals.

 

This project was run at a time when students traditionally have a high workload. There is also a general, though anecdotal, view within the target student population that feedback on their work throughout the course has been mixed and, in parts, very poor. Delays in giving feedback on assessed work have had a noticeable effect on student motivation, and this theme runs through the initial discussions and survey results.

 

The first student survey (see right) asked about students' motivation for joining the course, their methods of working on assessment material, their use of feedback and their aspirations towards achieving high grades.

A student who joined the course as part of a deliberately chosen career path may be more inclined to view employer involvement as vocationally relevant and, therefore, more motivational (Hoskins and Newstead, 2009). A local web and software developer, who has worked on some large projects and was keen to be involved, delivered a workshop session to help students learn WordPress and to introduce the process of developing a website from client requirements to initial prototypes. A second web designer ran a 'WordPress clinic' to help students complete their websites, with support for solving the technical problems they were experiencing. Bursary funding allowed me to set up the workshops as paid sessions and to rent some office space to give the process a more real-world feel. Meetings with the client for the website added to the real-world experience.

 

Analysis of student hand-in rates was carried out before the project and compared with hand-in rate data collected during the project. Students' opinions on their motivation levels were sought both before and after the project through questionnaires and focus groups.

 

The sample population

 

The sample population was the whole population of two groups of BTEC Level 3 Extended Diploma/Diploma in IT students. I have taught both groups for two units.

Unit 6, Software Design and Development, was delivered and assessed in the first half of the 2013-2014 academic year. For their assessment, students were expected to build a portfolio and to complete a final project incorporating all the skills covered by the unit. The portfolio was built throughout the unit, with work set regularly as homework. Students were given an 'achieved' or 'not achieved' result for each piece of homework and then had the opportunity to return to it and improve it if they wished. Because skills and knowledge were assessed across a number of homework tasks, a student did not need to return to every task in order to build their portfolio but could be assessed on a 'best piece of work' for each assessment criterion.

Unit 28, Website Development, was delivered and assessed in the second half of the 2013-2014 academic year. For its assessment, two assignments were delivered in the first stage of the unit. These were delivered in the same way as the other units students have taken on the BTEC Level 3 Extended Diploma/Diploma in IT: the assignments were designed to assess named criteria and were set a first deadline, followed by a second deadline for submission of an improved version. All students expect to make this second submission unless they achieve a pass on the first, in which case some choose not to resubmit.

 

A third group also took both units with me, but this group started both units at the beginning of the academic year, with the website development unit continuing throughout the year. This group was already working on a different project. To be part of this research project, students needed to be in one of the groups studying Website Development in the second half of the year only.

 

The opportunistic sample of two groups formed the total population, and every student in these two groups was selected. A total of 27 students were registered on the course at the start of the project. One student, however, suffered personal issues that meant he did not attend the college during the life of the project; he therefore did not participate and is not included in the sample. The total number of students selected for the sample population is 26: 11 in one group and 16 in the other. During the life of the project, two students left the college, one to take up an apprenticeship. Final data analysis, using data collected after the intervention, therefore relates to a slightly smaller sample population of 24.

 

Empirical deadline meeting data

 

Data relating to students meeting deadlines was collected from my own records and verified by checking the online upload records. When students submit work for marking, they upload it to an area on the college's Virtual Learning Environment (VLE), Moodle, which has been set up for this purpose; there is one area for each assignment or task. When tasks or assignments are assessed, the outcome is recorded on a spreadsheet by the assessor (in this case myself) and a comment is recorded in the assignment area on Moodle. If a second attempt is made, I would normally record the date in my feedback.

 

During the delivery of the two units, fifteen pieces of work were set: twelve 'homework' tasks, one final project for Unit 6 and two assignments for Unit 28. Although they differed in size, difficulty and topic, these 15 pieces of work were recorded simply as handed in on time, resubmitted with improvements, handed in late or not handed in at all. This is a simplification of the process: it does not take into account students who struggled to gain the skills or understanding, students who may have experienced external difficulties that affected their ability to submit work, or students who may not have made a significant effort on first submission, choosing instead to gain an advantage from the feedback given by the assessor. Such data is not available for analysis.

 

Data relating to student submission of work was encoded as shown in the accompanying table. Once each piece of work had been encoded, a summary assessment of each student's performance was made by applying the weighting to the data for each piece of work and calculating the sum of the weightings.

 

From this collated data, a set of five categories was derived. Each student's performance across the 15 tasks (12 homework tasks, two assignment tasks and one final project) was assessed by adding the weighted values for each task from the table above; each student was then placed into a category according to their final score.
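The scoring procedure described above amounts to a weighted tally followed by a banding step. The sketch below, in Python, is only a minimal illustration of that idea; the weight values, category labels and score boundaries are hypothetical assumptions for illustration, since the actual weighting and category tables are not reproduced in this text.

```python
# Illustrative sketch of the weighted scoring described above.
# The weights, category labels and boundaries are hypothetical; the study
# used its own weighting and category tables, which are not shown here.

# Hypothetical weights for each recorded submission outcome
WEIGHTS = {
    "on_time": 2,          # handed in on time
    "resubmitted": 1,      # resubmitted with improvements
    "late": -1,            # handed in late
    "not_submitted": -2,   # not handed in at all
}

# Hypothetical score boundaries mapping a student's total to one of five categories
CATEGORIES = [
    (20, "Consistently meets deadlines"),
    (10, "Usually meets deadlines"),
    (0, "Sometimes misses deadlines"),
    (-10, "Often misses deadlines"),
    (float("-inf"), "Consistently misses deadlines"),
]

def score_student(outcomes):
    """Sum the weights for a student's 15 recorded pieces of work."""
    return sum(WEIGHTS[outcome] for outcome in outcomes)

def categorise(total):
    """Place a total score into the first category whose lower bound it meets."""
    for lower_bound, label in CATEGORIES:
        if total >= lower_bound:
            return label

# Example: a student who met most deadlines but slipped on three tasks
example = ["on_time"] * 12 + ["late", "resubmitted", "not_submitted"]
total = score_student(example)
print(total, categorise(total))  # 22 Consistently meets deadlines
```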

 

The accompanying chart shows the distribution of students across the five categories. The data indicates that over half of the sample population (n=14) consistently met deadlines, one third (n=9) often or sometimes missed deadlines, and four students (15%) consistently missed deadlines.

 

Why did students take this course?

 

Students were asked to complete an online questionnaire:

A good proportion of students (45%, n=9) joined the course believing that it would help them to get a job in the IT industry. One student joined believing that it would help him/her to get a well-paid job; the question did not specify whether the well-paid job would be in the IT industry, but there is anecdotal evidence to suggest that students perceive the IT industry to be generally well paid. A similar proportion (40%, n=8) joined the course for what might be classified as more intrinsic reasons, relating to learning new skills or learning as much as they could about the IT industry.

 

Grouping the responses by whether the reason for joining the course related specifically to working in the IT industry indicates that 65% (n=13) have a strong interest in working in the IT industry. No analysis was made of students' perceptions of the IT industry or of what it involves. Only one student indicated that the course is a vehicle for progression to university education; of the students who participated in this survey, four have applied for a place at university, but only one cited this as a reason for joining the course.

 
[Chart: Reasons for joining the course]

How do students approach assignments?

 

Just over half of the students (55%, n=11) stated that they start to work on an assignment immediately. Four students (20%) stated that they organised each assignment around others they had been set, and one quarter (n=5) of the participant population reported that they left assignments until the deadline was close. A common complaint from students is that a cut-off time of 5.50pm on an assignment hand-in day is too early and that they need the evening of a deadline date to make sure they finish the work in time. Of those who said they started an assignment immediately, two students qualified this by saying that they "start it when given, finish near deadline" or that they "make a start to an assignment as soon as possible, but work on others". These responses reinforce the problem students have in organising their time when they have a heavy assignment workload. Although staff make every effort to spread assignment deadlines evenly, the deadlines overlap, which gives the impression that there is a lot of work to be completed at once and results in amotivation in some students, who may then not meet any deadline at all.

 
[Chart: How students approach assignments]

Do students meet deadlines?

 

Interestingly, no student answered that they rarely met deadlines. The empirical data collected on submission of work by the deadline indicates that three of the students who responded to the survey consistently missed deadlines and therefore rarely met them. These students answered that they sometimes met deadlines, so there may have been a difference in interpretation arising from the positive and negative connotations of the words "sometimes" and "rarely". No attempt was made to quantify these terms for students responding to the survey, so it is possible that these students did have an accurate concept of their hand-in rate but interpreted the wording in an unexpected way. On verification of the empirical data relating to meeting deadlines, other staff members rated students more favourably, probably because of the different ways that staff deal with students. One member of staff gets students to complete all assignment work in class and lets them leave the class once the work is in and checked, whereas I tend to use class time for learning and set assignment work for students to do in their study time. This is bound to affect the rate of deadline meeting and may also affect a student's perception of how often they meet deadlines.

 

Students' perceptions of how often they meet deadlines differed from mine for nine students, all of whom assessed their deadline-meeting rate as higher than I did. I had given a rating of 3 to one of these students, but the other eight students had ratings of 4 or 5. No student with a rating of 4 or 5 who completed the questionnaire rated their deadline meeting as the same as or lower than mine.

[Chart: Meeting deadlines]