Oh no, the scores are so low

What can we do when we think judging is too strict?

The generally low scores for our members’ entries in the October monthly evaluation of images were quite a talking point at our club meeting on Wednesday the 4th.

Maybe we were particularly gloomy and negative because of the disruptions to roads, electricity and water supply, and the cancellation of social events, all caused by the worst rainstorm in 200 years that had hit Hermanus about ten days before the meeting. But the scores were really low. For example, only 8 images received a score of 24/30 (a Gold on senior levels) or higher. That is 18% of the total of 44 images entered, which is less than half the average of 48% for the previous six monthly meetings.

The top-scoring October image did receive a decent 26/30. But that was still lower than any top score of the previous six months. So what should we make of that?

A search on the internet confirms that inconsistent scoring is not a rare phenomenon in the evaluation of images, and that judges are aware of it.

“Many judges stress the subjective nature of photography as art,” writes one Bill Katzenstein on iconicphoto.com. “Some admit to certain biases in assessing photographs, such as a dislike of utterly abstract, digitally conceived imagery.”

Being a judge for camera clubs is a humbling experience, writes JM Lobert on blogspot.com, “as everyone in the audience is expecting you to give them useful feedback, a consistent score and you are expected to be an expert of sorts, be it for bird, nature, creative, portrait, street, journalistic or any speciality kind of photography”.

All judges have their own artistic focus and cannot possibly be an expert at every specialty that the hobby provides, he argues. “On top of that, all judges have their personal likes and dislikes, which makes any feedback ultimately not objective and in many cases highly subjective.

“Camera clubs (at least in the US) tend to have that notion that nobody should be assigned a low score as to not discourage them to enter competitions. This is truly misguided, in my opinion.

“Nobody is served when everyone is considered ‘average’ or everyone is ‘good’; it is not reality. If you don’t want low scores to be assigned, why do you have them in your scale?”

The importance of scores is underlined by Donna Brok of the North American Nature Photography Association, on its website: “The job of a photography judge is, of course, to provide expert feedback. The first and most basic level of feedback is scoring, based on the judge’s analysis of the work.”

But scoring is only one part of feedback: “Ideally, the judge will also give a second level of feedback with helpful advice for improvement or explanations of why an image was or was not successful.”

None of these commentators really touches on the issue of generally low scoring by a particular judge. A contributing factor could be that a judge who usually scores out of a total of 15, the most common scale in South Africa, has some difficulty adjusting to a scale with a maximum of 30, which is used by our club and some others in the Western Cape.

We may have been spoilt by our club committee’s success in finding judges from all over the country for our monthly evaluations during the past year. The variety of viewpoints, expertise and critique they bring is generally like a fresh breeze. But even before the low October scores, their scoring had varied by as much as 18 percentage points.

Even in the case of salons, where panels of three experts are appointed to judge each entry in order to improve consistency, the scores for a specific image can vary substantially from salon to salon. (See the example of the Nyala image.) Maybe we should just accept that that’s life – like variations in annual rainfall and in the occurrence of heavy rainstorms all over the world.

An example of how scoring can vary, even at the high level of photographic salons. This image, “Nyala youngster”, was entered in 16 different salons over a period of more than a year. Half the expert judging panels gave it a score of less than 30/45. Eventually it was accepted for exhibition by three salons, with scores of 37, 35 and 34.