Using AI Tools

Artificial Intelligence (AI) tools can aid you in your schoolwork or at your job in many ways. It is important to use AI tools ethically and responsibly while you are a student at NBCC. AI should be used to improve your understanding of a subject, not as a shortcut to completing your assignments.

When can I use AI?

Before using AI for your coursework, you are required to check in with your instructor and discuss how and when you have permission to use an app or online tool.

Students have had success using AI tools to:

  • Further understand a topic discussed in class
  • Help them create a study schedule
  • Translate text into a different language
  • Quiz them on a topic in preparation for an exam
  • Brainstorm ideas on a topic
  • Get feedback on their writing


Citing AI

If you have permission to use an AI tool in an assignment, you still need to cite it as a source, just like you would cite any other source. For example, when including information from ChatGPT in an assignment written in APA format, include an in-text citation after the information:

"ChatGPT is a computer program that uses advanced language understanding to have text-based conversations and provide information or assistance on a wide range of topics" (OpenAI, 2023). 

You also need an entry on your References page:

OpenAI. (2023). ChatGPT (Aug 3 version) [Large language model].

For more information on when and how to cite sources in your assignments, check out our Plagiarism and Citation Guide.


When shouldn’t I use AI?

Students cannot use AI without their instructor's permission for a specific tool and use. AI tools should not be used to plagiarize; you should not have AI write part or all of an assignment for you.

AI tools like ChatGPT are often not factually reliable. Using AI for classwork or assignments can result in getting incorrect information. While ChatGPT can be helpful for brainstorming, the information it gives you needs to be fact-checked carefully. 

Is AI Reliable?

No, not always!

Large Language Models (LLMs), like ChatGPT, analyze an enormous amount of text and can generate new text by noticing patterns in which words and phrases are commonly used together. By doing this, they appear to have "learned" language. It's important to understand that when you use ChatGPT, it is not searching the internet for a correct or commonly found answer. It is generating new text based on the prompt it is given.
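To get a feel for what "noticing patterns in which words are used together" means, here is a deliberately tiny sketch of the idea using a toy bigram model. This is only an illustration of pattern-based text generation, not how ChatGPT actually works; real LLMs use neural networks trained on vastly more data, and the example corpus here is made up.

```python
import random

# Toy corpus: the "training data" for our miniature model.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Learn which words are observed to follow each word (bigram counts).
follows = {}
for word, nxt in zip(corpus, corpus[1:]):
    follows.setdefault(word, []).append(nxt)

def generate(start, length=5, seed=0):
    """Generate text by repeatedly picking a word that was seen
    after the current word in the corpus - pattern-matching, not
    looking up a 'correct' answer anywhere."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        options = follows.get(out[-1])
        if not options:
            break  # no known continuation for this word
        out.append(rng.choice(options))
    return " ".join(out)

print(generate("the"))
```

Notice that the output is fluent-looking but is assembled purely from observed word patterns; the model has no idea whether what it produces is true. That is the root of the reliability problems described below.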

Fake Citations

It has been widely reported that ChatGPT will make up citations or sources when prompted. These fake citations may look like scholarly sources, but the articles, journals or books referenced may not exist at all.