Ask a silly question, get a non-standard answer.

For many years I have been aware that educators use plagiarism detection software (Turnitin) to discourage students’ plagiarism, but today I discovered that students have begun to use paraphrasing software that may or may not be specifically configured to defeat plagiarism checkers (article). The paraphrased writing these tools produce reads strangely enough to be fairly easily recognisable as not normal (although the same could be said for some students’ authentically composed work).

Growing use of refined versions of these tools could conceivably lock educators and students into an arms race of text analysis and obfuscation.

When I was teaching I experienced a similar-ish situation. Students’ assignments contained many, many ‘ready-made’ answers to questions: answers easily found using search engines (and often by using the questions verbatim as search terms).

My first response was to refuse to accept such answers as assignment content, but this response was seriously limited by the proviso that information from websites was permissible if it was referenced. Most of the students quite soon became rather good at correctly referencing the content that they used. The students had discovered that the external rules for the delivery of the qualifications they were studying for made no clear pronouncements on the permissible proportion of original content (nor did they permit centres to make their own strict rules, as that would create standardisation problems). Presumably the students must have carefully read through these (very dry) regulations, or at least vaguely known of someone who had done so. Assignments that consisted almost entirely of referenced quotes, linked by a few throwaway extra lines, were common, with different students’ assignments differing only in those few throwaway lines.

The next round of the battle involved my ruling that the first submitter of an internet source had the right to use it (they had at least found it themselves, probably, rather than having been sent it by someone else), but subsequent submissions of the same source were considered plagiarisms of the first submission. This policy worked as intended until most students learned that different answers to the same question were available online in abundance, and that these answers were sufficiently generic that many paraphrased versions of pretty much any given answer existed; hence a class of students could all find non-identical versions of an answer with a small degree of effort. This did not imply that most students recognised the common elements of comparable, generic answers in any analytical sense: they arrived at them instrumentally, by combinatorial elimination (repeatedly submitting work and reacting to the feedback).
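To make the detection side of this concrete: a crude but serviceable way to spot two submissions built from the same source is to compare their overlapping word sequences. The sketch below is only illustrative of that idea; it is a generic word-shingle comparison, not the procedure I actually used at the time, and the function names and the 0.5 threshold are arbitrary choices for the example.

```python
# A minimal sketch (not the classroom procedure described above) of flagging
# submissions that reuse the same source text, by comparing word shingles.

def shingles(text, n=5):
    """Return the set of overlapping n-word sequences in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Jaccard similarity: shared shingles divided by all shingles."""
    return len(a & b) / len(a | b) if (a or b) else 0.0

def flag_shared_source(submissions, threshold=0.5):
    """Yield pairs of submission IDs whose texts overlap heavily.

    `submissions` maps an ID to the submission text; `threshold` is an
    arbitrary cut-off chosen for illustration.
    """
    ids = list(submissions)
    sets = {sid: shingles(submissions[sid]) for sid in ids}
    for i, first in enumerate(ids):
        for second in ids[i + 1:]:
            score = jaccard(sets[first], sets[second])
            if score >= threshold:
                yield first, second, round(score, 2)
```

Two assignments pasting in the same web page score close to 1.0; a genuinely different (or heavily paraphrased) answer scores much lower, which is exactly the gap those generic, non-identical answers slipped through.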

Gradually I came to understand that the instrumental ability to find generic answers was effective only for standard questions: the sort of questions that very clearly and directly addressed learning criteria and standards from qualification and curriculum specifications. If non-standard, idiosyncratic questions (silly questions?) were asked, questions which had the same underlying meaning as standard questions but were phrased differently or appeared in specific contexts, then search engines were poor at finding ready-made answers, responding more to the phrasing or context than to the meaning of the question. Developing this strategy, I came to realise that the limitations of Google’s search algorithms made it possible to include combinations of terms in questions that ‘spoofed’ the algorithms into returning content which nominally matched the question but was semantically inappropriate.
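A toy example of why literal keyword matching responds to phrasing rather than meaning: the snippet below is a deliberately simplified stand-in, not Google’s actual ranking, and the question and page texts are invented purely for illustration.

```python
# A toy keyword-overlap 'retriever', to show in miniature how rephrasing a
# question can hide a ready-made answer from a literal matcher.
# A simplified stand-in, not how a real search engine ranks pages.

import re

def tokens(text):
    return set(re.findall(r"[a-z']+", text.lower()))

def overlap(query, page):
    """Fraction of the query's words that appear in the page."""
    q = tokens(query)
    return len(q & tokens(page)) / len(q) if q else 0.0

# A ready-made answer page (invented for the example).
page = ("Describe the main stages of the water cycle. The main stages of the "
        "water cycle are evaporation, condensation, precipitation and collection.")

standard_question = "Describe the main stages of the water cycle."
silly_question = ("Your town's reservoir mysteriously refills itself overnight. "
                  "Explain, step by step, how the water got there.")

print(overlap(standard_question, page))  # high: the page is easily found
print(overlap(silly_question, page))     # low: same underlying topic, but missed
```

The same imbalance works in the other direction too: seeding a question with terms that match well-indexed but semantically irrelevant pages is what made the ‘spoofing’ described above possible.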

Careful use of Advanced Search, or (better still) thinking about what a question actually meant and rephrasing it in various ways, could go a long way towards guiding students to ready-made answers that would be useful to them. Generally, students were much less successful at doing that than they had been at bypassing my earlier strategies to get them to produce original work. Many students seemed genuinely to find it hard to accept that search engines could fail to deliver required information that was nominally searched for, or that deliberative thinking was an important consideration for search engine use.