
Professor at Texas A&M flunks class because of AI

Orbit

I'm a planet
My university started today, and this was brought to my attention by one of the IT people. A professor at Texas A&M University failed his students because he pasted their final essays into ChatGPT and asked it if it had written them. Of course ChatGPT said yes (which was incorrect; ChatGPT has no way of recognizing its own output), so he failed the class.

This got me thinking about AI and education in general. Just this semester, we put verbiage related to AI in our Academic Integrity Policy on our syllabus template, but the reality is, students can choose to cheat and most of the time, we won't catch them. Can they cheat their way through an entire college education? Should we just do online classes where the AI professor and the AI students talk to each other? What does AI mean for university education?
 

dybmh

דניאל יוסף בן מאיר הירש
ChatGPT has an identifiable writing style. My daughter is an English major, and some of her classmates tried to cheat with ChatGPT and were caught. According to my daughter, the professor warned them in advance.

Maybe the way to police it is to pretend to be a cheating student: ask the AI to write the essay a few times, and read those AI-produced essays to get a feel for the way it answers the questions. It won't answer exactly the same way each time, so multiple fake essays would be needed.
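The "pretend to be a cheating student" idea above can be sketched in a few lines: generate several AI essays for the same assignment prompt, then flag any submission whose wording closely tracks one of them. A toy sketch in Python (stdlib only; the function names and the 0.8 threshold are made up for illustration, and bag-of-words cosine similarity is far cruder than what real detectors use):

```python
from collections import Counter
import math

def cosine_sim(a: str, b: str) -> float:
    """Cosine similarity between bag-of-words vectors of two texts (0.0 to 1.0)."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

def flag_if_ai_like(submission: str, ai_samples: list[str],
                    threshold: float = 0.8) -> bool:
    """Flag a submission whose wording closely matches any pre-generated AI essay."""
    return any(cosine_sim(submission, sample) >= threshold for sample in ai_samples)
```

In practice the professor would generate `ai_samples` by feeding the essay prompt to the chatbot several times, since each run words things a bit differently.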
 

Twilight Hue

Twilight, not bright nor dark, good nor bad.
I think it's going to be a serious issue in the coming years.

You don't want dumb-as-a-rock students using AI to do their homework for them.
 

Heyo

Veteran Member
This points towards a problem that is older than AI. Is the evaluation process adequate if it can't discern a cheating student from an honest one? After all, could not a student proficient in using AI become a valuable professional using AI?
The professor was clearly not proficient in using AI.
 

dybmh

דניאל יוסף בן מאיר הירש
There are several AI content detectors out there. This one appears to be well reviewed. Better offerings are sure to be developed.

 

Nakosis

Non-Binary Physicalist
Premium Member

Cheating has been around long before AI. There are sites that provide pre-written essays on various subjects, along with student critiques of the different professors so you can pick one who grades easier or on a curve. My company would only hire college graduates, but they were still dumb as a rock, just with a college degree.

My point being: maybe companies will soon realize, why hire college graduates when AI can do the intellectual design work for you? Maybe just hire a team of folks with experience in AI inquiry.
 

Nakosis

Non-Binary Physicalist
Premium Member
As a retired teacher I can assure you that cheating is both rampant and unstoppable. Have a nice day.

It seems more prevalent in the US, as we'd usually end up having to hire foreign-educated applicants to find people who actually had an acceptable understanding of their field. Either foreign students are less savvy about using the internet to cheat, or foreign educators are more savvy about how to prevent such cheating.
 

Mock Turtle

Oh my, did I say that!
Premium Member
I assume this is likely to affect some areas much more than others, given that knowledge in STEM subjects is harder to fake, especially in exams. I'm sure things have changed a lot since I took any exams, but back then we had to memorise all the formulae in every subject, which gave an advantage to those with especially good memories. I doubt this is done now.
 

Orbit

I'm a planet
Actually, my husband, who is a computer programmer, gave ChatGPT a programming problem to see what it would do. It produced an answer that followed the rules and syntax of the programming language specified, but contained *commands* that didn't exist. In other words, it made stuff up. But anyway, my point is even STEM isn't safe.
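The hallucinated-commands failure described above suggests one cheap sanity check: statically scan generated Python for names that are neither defined in the snippet nor builtins. A minimal sketch using the stdlib `ast` module (illustrative only; it ignores attribute accesses, lambdas, and comprehension scopes, and real tools like pyflakes do this properly):

```python
import ast
import builtins

def undefined_names(source: str) -> set[str]:
    """Return names the snippet reads that it never defines and Python doesn't provide."""
    tree = ast.parse(source)
    defined = set(dir(builtins))
    loaded = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            defined.add(node.name)                       # function name
            defined.update(a.arg for a in node.args.args)  # its parameters
        elif isinstance(node, (ast.Import, ast.ImportFrom)):
            for alias in node.names:
                defined.add(alias.asname or alias.name.split(".")[0])
        elif isinstance(node, ast.Name):
            if isinstance(node.ctx, ast.Store):
                defined.add(node.id)   # assigned somewhere in the snippet
            else:
                loaded.add(node.id)    # read somewhere in the snippet
    return loaded - defined
```

Running this on AI-generated code would surface made-up calls like the ones in the example above, since a nonexistent command shows up as a name that is read but never defined or imported.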
 

Mock Turtle

Oh my, did I say that!
Premium Member
But spotting any mistakes - as in your example - might be easier. ChatGPT seems to make up sources for references too. I've not used it yet.
 