The use of ChatGPT in the academic community has raised ethical concerns, so it's essential to explore this topic further. Is it ethically acceptable to use ChatGPT in academia?
This is a very difficult question. Your question asks whether it is ethical, and by the standards of everything that has gone before, I think the answer is no.
But the rules, expectations and standards have changed so much that it has become acceptable now. Think back to when spiked running shoes were considered unethical in races - we consider them perfectly normal and natural today!
One of my 'jobs' at my university is to investigate plagiarism. It does not need the genius of Sherlock Holmes or Hercule Poirot, or the tenacity of Lt. Columbo. Investigating so-called AI-produced essays takes no more skill than any other form of plagiarism investigation. Commissioned, bespoke-written essays have been around since Socrates was the module co-ordinator.
Some years ago, certain dubious individuals used to advertise 'assistance with essays and dissertations' in the Student Law Review (of all publications). The 'assistance' was writing the essay for you, for a fee. It failed as frequently as it succeeded, and where the tutors were vigilant it failed more often than not.
Any tutor knows what they have taught in class, and seminar tutors in particular have usually identified the most competent students halfway through the semester.
Where an essay appears too good to be true, it very often is. The use of material not introduced in teaching indicates one of two things: either a very bright student who carries out research beyond what was taught in class, or a plagiarist. The bright student you probably already know.
If plagiarism is suspected, it normally takes only two or three questions to ascertain that the student knows nothing about the material included in 'their essay'. Why, for instance, would a student who has written an essay on the reverse burden of proof in evidence be unable to explain what it means when asked? Why would a student who had written about return on capital employed be unable to explain orally what that term means?
You can apply this to any subject you teach. Someone who 'wrote' a brilliant essay on the Roman Republic would, we would expect, have some idea who the Gracchi were - and if not, why not?
It never fails, because the reality is that those who plagiarise seldom even read the essay they submit, let alone write it.
There are, of course, some students who are just brilliant anyway - even some whose attendance records are poor - but they are by far the exception. It is best not to over-generalise, but there is nevertheless a rule of thumb in plagiarism detection. AI-generated essays are no more immune to detection than those written by flesh-and-blood ghost writers; the idea that they are is yet another academic myth.