Module 3, Level 6 Reflection - Working with AI

Plagiarism and academic dishonesty - I am confused about why AI detectors are used if they are unreliable. Don't they run on the same AI models and data that generate the content in the first place? The claim of unreliability runs counter to my own experience with AI detectors. I check work with a detector when it seems off or fishy, and for the most part it's dead on, and I work with a LOT of ELL students. I read the article about how detectors score writing based on "perplexity," but the few I have worked with are more likely to catch content that is high in perplexity than low in perplexity. It was the low-perplexity writing that I had to confirm myself was AI-generated and copied. The article was a year old, so maybe the detector I used was newer or of a different type, but every time I checked, it was a true positive. I did not get false positives, or at least none I caught. This could be confirmation bias, but I hear similar reports from my English-department coworkers.
As for minimizing this discrepancy, I will continue to check the AI's conclusions.
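The "perplexity" scoring mentioned above can be illustrated with a toy sketch. This is only a minimal illustration, not any real detector: actual tools score text with large language models, whereas this uses a simple unigram word model (with made-up reference text) so the idea is runnable standalone. The point it demonstrates is that predictable text gets a low perplexity score, which is why detectors tend to flag unusually low-perplexity writing as machine-generated.

```python
# Toy illustration of "perplexity" as used by AI-text detectors.
# Assumption: a unigram word model stands in for the large language
# models real detectors use; the reference text is invented.
import math
from collections import Counter

def unigram_perplexity(text, reference_counts, total):
    """Perplexity of `text` under a unigram model built from reference counts."""
    words = text.lower().split()
    # Laplace (add-one) smoothing so unseen words don't get zero probability.
    vocab = len(reference_counts) + 1
    log_prob = 0.0
    for w in words:
        p = (reference_counts.get(w, 0) + 1) / (total + vocab)
        log_prob += math.log(p)
    # Perplexity = exp of the average negative log-likelihood per word.
    return math.exp(-log_prob / len(words))

reference = "the cat sat on the mat the dog sat on the rug".split()
counts = Counter(reference)
total = len(reference)

# Predictable text (words the model has seen often) scores low...
common = unigram_perplexity("the cat sat on the mat", counts, total)
# ...while surprising text (all unseen words) scores high.
rare = unigram_perplexity("quantum marmalade zeppelin", counts, total)
```

Here `common` comes out lower than `rare`: the familiar sentence is "less surprising" to the model, which is the property detectors use to separate formulaic machine output from more idiosyncratic human writing.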

Plagiarism and academic dishonesty are a real concern with AI. Thinking and making connections takes time. Even before AI, I would have students who would copy and paste code from Stack Overflow. I would have to explain to them that it is the process, not the product, that I am interested in. The struggle and trial and error are where the real learning takes place. The easier it is to find an acceptable answer, the less the struggle is valued by students. In addition, when they see other students cheat and get an A, it makes them value the process even less. One way I try to avoid this is by giving them plenty of class time to do the problem solving in class, so there is less temptation to get an easy answer. In addition, students who finish work early are given more challenging problems to solve instead of free time; this way there is less incentive to cheat to get the answer quickly.

I chose plagiarism because I feel like that is a hot topic in my school. I found it interesting that the detection software is not as accurate as hoped. I think a key to addressing this is having students show a brainstorming process. Turning in an outline or some sort of thinking map is the first step, as well as having the obvious conversation that a student's own writing will not match an AI version. I think kids have to be taught and retaught copyright law and what plagiarism really is. Finally, the consequences of choosing to cheat need to be clear.

My concern was overreliance and the loss of critical thinking with the increase of AI. I feel that sometimes we rely so heavily on AI or technology that when it isn't working, we are not able to do the skills we once knew how to do. For example, cashiers knowing how to give change without the cash register, or editing, spelling, and writing without AI, spell check, and/or autocorrect. For me as a teacher, I will use AI for lesson plan ideas but will always look at the output critically to make sure it is covering the standard, so that I am constantly unwrapping the standard instead of just grabbing a lesson and teaching it as is. When I mentor fellow teachers, I will continue to stress the importance of this so that they are always looking at the material with a critical eye. As for students, I can choose AI activities that build critical thinking skills, like the games on code.org (I just giggled cause I remembered that I am answering in here as well as in my RPDP class :-D), that really do promote critical thinking. I have played the games at the kindergarten level, and I think they are great activities for teaching my kids perseverance and critical thinking skills, especially the activities outside of the computer, like creating a path with arrows and then going into the computer game.

I think to help combat overreliance and loss of critical thinking, students have to be provided more opportunities during class time to engage in activities that allow for critical thinking. If the only time they're being asked to think critically is when they're answering a question or two or writing a short paper, and they're doing it on their own with no interaction with the teacher or classmates, then there are a lot of missed opportunities. There should be questions asked aloud (to check for understanding), activities, and tasks scaffolded during class time to still allow students to think critically without having access to technology/AI tools. It should be embedded as a regular part of the learning process.

Plagiarism and academic dishonesty are things that most educators are concerned about when we talk about students using AI. The negative aspect is that students and teachers don't truly know what the students know, because students are just having AI do all of the thinking. They copy and paste the information (correct or not) and move on to the next task. I think to combat this, it is important for staff and students to learn how to properly cite information when they use AI to help them. Asking students more reflective questions at a higher DOK level could also help with this issue. I think what is most important is informing students of the issues with academic dishonesty and how to use AI responsibly.

Overreliance and loss of critical thinking would be a concern. I remember researching college papers at the library and spending hours going through microfiche. Then the internet came along and made life easier when it comes to researching something. But now with AI, you don't even have to think about how to research the topic. All you have to do is ask AI the question, and it will generate all the information you are seeking. I'm not too sure this is a great way of retaining the knowledge the teacher is trying to get you to learn.

Plagiarism and academic dishonesty are potential problems of misusing AI. Students might resort to AI to generate content without applying critical thinking or their own knowledge.

As an educator who has already seen a lot of usage of both apps and AI for academic dishonesty, I am very concerned with accountability for students. In the past I have had students cheat on every classwork using AI/Photomath/Mathway. But how you ask? It’s classwork! Liberal school policies allow for 5 days for the student to turn in work so they would take it home and return it finished and perfect. Then when the in class exam was given, students could not produce the same results. Parents were baffled. “Must be test anxiety!”, they say. As it stands now, school policies do not allow for accountability in daily formative work, only summative work. Though I can circulate and offer help, and observe the lack of daily effort, students still learning about consequences will often choose the path of least resistance and of immediate gratification.

Overreliance and loss of critical thinking:

I think that there will be an issue with our students overrelying on AI, and the consequence will be that they have a hard time coming up with critical thought and their own creative ideas. Having taught teenagers for so long, I know there will always be a group of students who try to take the easiest road possible. I think that really teaching about AI, its changes and issues, and discussing how society will change will be important, so they understand the consequences of not having those skills. It will be more important than ever, as we move forward with AI, for students to have strong critical and creative thinking skills. Along with discussions about AI, I will set clear guidelines on using AI and demonstrate how to use it acceptably to enhance ideas. Really, the most important thing is to teach them how to use it as a tool and not as a replacement for their own thoughts.

Teaching with AI can promote overreliance and loss of critical thinking. As students are encouraged and expected to use AI tools, they develop a sense of dependency on AI even for simple tasks that do not require it. In this manner, as students are provided with quick, effortless answers, they lose the ability to process information. In addition, although AI can be a great tool, it is not bias-free. To minimize the negative aspects of AI on student achievement, I like the suggestions mentioned previously: for example, mixing up assessment design and using project-based learning, oral exams, and other formats that require active demonstration of knowledge and skills.

A negative aspect of students using AI is copying work that is not their own and passing it off as if it were theirs. Another negative is students' overreliance on AI, which reduces their creativity and critical thinking skills. A way to minimize this is by setting clear rules, encouraging students to talk about AI, and being transparent. Another way is to design assignments that require critical engagement, personalize instruction, and teach students about academic integrity.

The negative aspects of plagiarism and academic dishonesty are students becoming reliant on AI to do their work for them and losing the ability to think critically. In addition, grades will not be a measure of their understanding, but a measure of their ability to query the AI to get the desired response. One way to combat this is to create more complex assignments that require students to synthesize information garnered from AI, so that "one and done" queries won't work. Another way would be to incorporate AI into the assignment, but again to have students synthesize the information.

Analysis of Negative Aspects:
AI tools can undermine student accountability; they can become a shortcut. Students might use AI-generated answers instead of engaging with the material. This can hinder their ability to develop critical thinking, problem-solving skills, and a sense of responsibility for their learning. Excessive reliance on AI can also make it challenging to assess genuine student understanding and effort.

To address these concerns, I would require students to reflect on their process, such as explaining how they used AI and how it contributed to their understanding. I could also incorporate tasks that require personalized or creative outputs, such as photography projects or critiques, where AI serves as a support rather than a substitute for their original thinking.

I think to mitigate the overreliance on AI and the loss of critical thinking, we will need to change the types of questions we are asking. This will also help with plagiarism and academic dishonesty. We can create assignments for our students where they are required to utilize AI and reflect on their experience (such as in the last module). We can also ask thought-provoking questions that AI won't be able to answer. As a math teacher, I can also require students to explain to me how they found their answer.

I like the idea of mixing up the assessment types. This way students cannot rely on AI. You can use PBLs, have students model their learning, and orally share their knowledge to avoid cheating. Another way to help avoid cheating would be to teach the students to use AI as a tutor as we did earlier in our learning.

My primary concern is overreliance and loss of critical thinking. Students have already tried to just plug assignments into AI and copy the results with no thought as to whether the answer is reasonable. My first step should be to come up with an AI policy to start outlining appropriate uses. Next, I want to start finding places to implement AI with students so we can begin to learn proper use and understand how to interpret the results correctly. I have a number of upcoming lessons with significant amounts of data analysis that could lend themselves really well to some introductory AI use, which I think would be a great starting point.

Plagiarism and academic dishonesty are how I initially experimented with Copilot, and while my first attempts were rebuffed, with a little tinkering it was far too easy to bypass the built-in safety rails. The previous page mentioned restructuring assignment design to create more opportunities to demonstrate student learning and mastery, which fits perfectly with the training I am currently undergoing in project-based learning. I think shifting the assessment process (and the learning process, for that matter) toward a set of driving questions and a project-developed model will help limit the opportunity to plagiarize. We are currently in the process of deploying PBL as a consistent instructional model, and while the initial outlay of work is daunting, it does require students to actually internalize the work and demonstrate their understanding in a more tangible fashion. In discussing the issue with colleagues, the use of on-demand writing also helps address the problem of submitting generated products as original work, so there are ample ways to circumvent the AI issue.

Loss of critical thinking: This was something I was really concerned about, then reading through “4 Balance: Realize the benefits of AI and address the risks” in a previous lesson, I realized that critical thinking skills are needed to properly use AI to determine if the output is substantial, accurate, and unbiased.

For the “Overreliance and loss of critical thinking” topic, I think that as long as we are willing to use an AI that is specifically prompted to aid in students' acquisition of knowledge, and not simply make the assignment a copy-and-paste exercise, it can be useful in prompting critical thought rather than replacing it.