
KARACHI:
AI has become an integral part of our lives, and we have become increasingly dependent on it. Gone are the days when researchers would spend a month crafting an introduction or compiling an annotated bibliography. There was a time, during clinical rotations, when we would encounter a case, go home, read a book and then Google it. But now, AI tools, including ChatGPT, SciSpace and Meta AI, have become our comfort zone. It’s like magic, fulfilling every request, from writing research papers to meeting tight assignment deadlines. No doubt, it’s a blessing.
However, like every blessing, it has a downside. While AI has provided comfort, it has also diminished the need for human effort and cognitive engagement. The language used in messages, emails and research papers generated by ChatGPT and other tools often reflects a limited vocabulary, making it apparent to readers that the work was not written by a human. Words like “crucial”, “comprehensive”, “novel”, “ensure”, “earlier”, “tailor” and many more are used so commonly that they make it obvious the writing is AI-generated.
Not only this, we have also seen medical professionals relying on AI when treating patients. They need to understand that AI provides only generalised knowledge; specific knowledge comes from research and reading books. As medical professionals and researchers, we believe that while it is fine to take ideas from AI tools like ChatGPT for direction, the actual writing should be done by you. AI can never replicate the depth of understanding and knowledge of the human brain. So use your intellect, take ideas from AI, but always write in your own words to make your work unique; and never forget to read research papers and books to enhance your clinical knowledge.
Faizan Saeed Syed and Nida Rizvi
New York