I have to admit, I’ve been finding it hard to conjure up the enthusiasm to write about anything of late. When I thought about why this was, one of the culprits was the yearly grind of the A-Level coursework. Although teaching is a very rewarding job at times, at other times it can be very hard to have how well YOU do your job judged on how well someone else performs in an exam or in a piece of assessed work. Someone posted this link on Twitter (I forget who, sorry) which I found really interesting – what if we treated doctors the same way we treat teachers? According to this article’s philosophy, if doctors were treated like teachers, then when you chose to eat so much junk food that you developed heart disease your doctor could not cure, your doctor would be the one judged to be failing! On the other hand, I suppose you could argue that it is our job to work with our “patients” every day to ensure this does not happen – although we only have a limited number of hours in which to make that happen.
Anyhow, nowhere is this judgement more prevalent than with the huge requirements of the A2 coursework – in both the Computing and ICT A-levels – which is ridiculously large compared with other subjects’ A-level coursework. Not only do students have to produce pages and pages of documentation (around 100 pages for some of my A-grade projects), they also have to write a piece of software. How is it fair to ask people to complete this amount of work when in other subjects they simply write an essay – albeit a researched and re-drafted one? I also find the mark schemes to be an arcane science: it seems that the teacher has to guess what the moderator is looking for (let alone the students trying to guess!), and no amount of poring over the scant mark scheme descriptors, going on exam board courses or closely emulating the example projects can tell you what you are supposed to present to gain the coveted marks.
And all that assumes you have a class of enthusiastic, motivated, A-grade students – what on earth does the prospect of this much work do for the weaker students, who take one look at it, immediately think they can’t do it, and give up? I read an interesting article the other day which I now can’t find (curses!) about how programming exposes students to regular failure, which, in our culture of ensuring success, is not something they encounter often at school. Those who give up easily tend not to do as well in other academic areas, whereas those who show resilience and try to solve their own problems generally do well elsewhere too. This school might indeed have the right idea. (Yet another cross-curricular boost from teaching Computing to tell your SLT about!)
So, I was thinking about how this kind of work could be better examined, and it’s a tough one.
I first thought about the coursework set at university in Computer Science – there was a lot of group work, which had good and bad points. Good, because it enabled you to work with a mixture of people, to learn from them, and for the project to be more realistic – I think it’s fairly unlikely that in industry you would work on a project entirely on your own. Bad, because it meant some people didn’t do enough work and it was difficult to assess each individual’s contribution fairly.
Then I thought about doing away with coursework altogether – but it would be rather silly for Computing not to have a practical element. It would probably result in more AQA-style programming exams, which I am again not a fan of. I find that programming in an environment where you are unable to look anything up or use other resources to help you is very artificial – how many programmers remember the argument order of every single pre-written function in a library? Certainly not me.
Then I thought about submitting just the source code, without all of the analysis and other rainforest-killing documentation that has to go with it. Again, not a great idea, because this makes for badly thought-out programs that don’t meet user requirements and are probably far too large in scope to be implemented. (Probably good for anyone who wants a career in programming for the public sector, though! Ho ho ho.)
So, in other words, as one of my old lecturers used to say, I don’t have the “silver bullet” for fairly examining Computing coursework. However, with the huge and really excellent grass-roots movement that is sweeping in the new Computing curriculum, I hope this is something that teachers and industry professionals can offer advice on, and that the exam boards may listen to and improve upon. For now, we’re stuck trying to encourage our students to produce programs and huge amounts of documentation through all of the usual carrot-and-stick methods available in school. Some good advice:
@learningdomain – “start early”
@bringbackcs “I say: Have you checked on-line guides? Have you worked through all tutorials? No?!! Let me know when you have.”
… but it still feels like more of an achievement for the teacher, having survived with everyone finished, than a test of the students’ software engineering capabilities.
What kind of documentation and analysis do you expect to go along with the code submitted? I don’t understand why the code couldn’t be the single asset required for an assignment. It’s not like other subjects, where you can hide your working and just show the result – the entire work is right there in the code.
At present you have to perform a requirements analysis and draw up a complete set of paper designs, process diagrams, ER diagrams (if necessary) etc., with an accompanying write-up. Then, once you’ve built it, you have to provide a full test plan with screenshot evidence of tests, a full manual, and an evaluation of whether the requirements were met.
That’s your problem right there: that isn’t programming, that is something else. Programming is the act of taking a set of requirements and writing the code to implement them. Ideally, early on in learning to program, these requirements will be fairly limited – perhaps just a single algorithm.
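To make the distinction concrete, here is a sketch of what such a limited, algorithm-sized requirement might look like. The requirement itself is invented for illustration: given a non-empty list of exam marks, report the highest mark and the class average to one decimal place.

```python
# Hypothetical beginner-level requirement (invented for illustration):
# given a non-empty list of exam marks, return the highest mark and
# the class average rounded to one decimal place.

def summarise_marks(marks):
    """Return (highest mark, mean rounded to 1 decimal place)."""
    if not marks:
        raise ValueError("marks must not be empty")
    highest = max(marks)
    average = round(sum(marks) / len(marks), 1)
    return highest, average

print(summarise_marks([62, 81, 74, 55]))  # (81, 68.0)
```

A requirement at this scale can be stated in a sentence, implemented in a few lines, and checked directly against its stated behaviour – no ER diagrams or design documents needed.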
Designing software, which means coming up with requirements, is not programming. It is a distinct field and usually has a different person doing it; quite often these people are not programmers. While a programmer should certainly be involved in the process, the high-level design process is not required in order to program.
An ER diagram is also not required in order to program. If you intend to store data in a database you will need such a diagram, but for coding the software you won’t. You may often use ad-hoc class diagrams before writing your code, but these are almost always informal, and the code itself will always show the final result.
I had such a class once as well. But in terms of industry, this approach is entirely misguided if you are trying to teach programming. Indeed, even if you are trying to teach a full software process (which is not the same thing as programming), it is beyond the scope of a single class. The requirements analysis aspect alone would take at least one entire course.
Did you ever remember the article you referred to about failure?