Leveraging AI Chatbots to Support Problem-Solving Skills
Adults can do much to encourage learners to view academic projects as platforms for meaningful reflection, theorizing, and self-expression. Essays and projects can give voice to who students are, what they think, and how they feel.
If AI chatbots are here to stay, is critical thinking for neurodivergent students at risk?
In a previous article, we provided an overview of some risks associated with using AI writers for school assignments, as well as guidelines on how to leverage them effectively as learning tools.
In a follow-up article on how to use AI text generators for research, we mentioned specific ways coaches and mentors can guide students who are embarking on AI-assisted academic projects.
Today, we explore the importance of helping students preserve a beneficial level of cognitive challenge, vet reference content for quality, incorporate their unique perspectives into writing assignments, and enhance their problem-solving skills while utilizing AI content generators.
No effort = no learning
In his Psychology Today article on the sweet spot for optimum learning, neuroscientist Stanislas Dehaene asserts, “A passive organism learns almost nothing, because learning requires an active generation of hypotheses, with motivation and curiosity.”
Prioritizing effort and grit may feel counterintuitive to neurodivergent students. They may struggle with ADHD, autism spectrum disorder, or other learning differences that hamper executive functioning and lead to decision fatigue. However, the right amount of cognitive struggle helps them access peak learning.
AI content writers passing as college students
In a UCLA study, psychologists concluded that although ChatGPT’s problem-solving ability has some limitations, its abstract reasoning is comparable to that of undergraduate students on standardized tests and intelligence assessments.
Rising Harvard sophomore Maya Bodnick conducted her own research, revealing ChatGPT’s ability to write essays for her freshman-year classes and satisfy her professors’ standards well enough to achieve a GPA of 3.3. “I basically found that ChatGPT is a college-level writer.” In Bodnick’s test, it did well with both formulaic and creative content, but failed where it had not been trained to provide a very specific answer.
The hazards of outsourcing effort
Setting aside the ethics of “cheating,” if a task is too easy, students don’t grow. If it feels too difficult, they become overwhelmed and may give up. Stepping outside their comfort zones and applying reasonable effort to assignments, including risk-taking and reworking an approach after failure, allows students opportunities to rehearse and refine cognitive skills.
Stressed learners may feel tempted to submit the immediate AI output verbatim, or at the very least, assume that it is good enough to paraphrase. The smooth content produced by AI tools may look great at first glance, and students may not recognize that initial responses always require analysis and revision. They also may not recognize how academic shortcuts will cause them harm.
Don’t let your learner circumvent opportunities to grapple with challenging problems
Academic skill building occurs when we don’t immediately know the answer, but arrive at new insight through manageable levels of effort. The impulse to lessen mental exertion in service of mere achievement is a strategy that will backfire when students face similar hurdles in the future.
Relying on AI to do the heavy lifting bypasses students’ critical thinking. It robs them of an opportunity to build reasoning skills that will aid them in embracing challenge. Furthermore, part of critical thinking is recognizing when a source may be biased or inaccurate.
Just because it comes from a machine doesn’t make the output high quality or factual
Travel writer Cameron Hewitt tested ChatGPT’s competence at generating a guidebook listing for Hillsborough Castle in Northern Ireland, offering the chatbot several chances to refine its output. Hewitt then compared the results with his own listing.
He concluded that the AI-generated text was lacking in personality, full of clichés, and reminiscent of advertising copy. “Even though I gave it feedback to tone down the promotional tone, the AI couldn’t resist making its listing sound like a commercial for Hillsborough Castle…It’s not surprising that AI defaults to a less nuanced, more actively promotional approach — because the vast majority of travel content out there is exactly that.”
Hewitt also noted that ChatGPT incorporated outdated details into its guidebook listing, including out-of-date ticket prices and references to a living Queen Elizabeth.
Students using AI chatbots also encounter incorrect and distorted output, but without the benefit of adult reasoning and fully developed executive functioning.
Inaccuracy, bias, and quality control proficiency
Along with the aforementioned hazards of bypassing a healthy cognitive struggle, students face additional risks when relying on the authority of skewed or nonfactual content from AI text generators:
- They may not critically analyze the output they receive.
- They may not leverage research skills to vet content for quality, fact check for accuracy, or use authoritative sources for verification.
- They may not bring sufficiently developed perspective to a subject or its context to recognize incorrect or biased information.
While learners may inadvertently trust faulty ideas and incorporate them into final drafts, educators can teach students to employ healthy skepticism, understand media bias, and evaluate their sources.
Your student as meaning generator
As navigators of the digital age, neurodiverse learners need to be guided in discerning the highest quality and most personally relevant ideas to include in academic projects.
Nothing can replace a student’s own opinion, perspective, and voice. These cannot be outsourced to an AI bot. Active learning occurs when a student generates meaning by synthesizing existing knowledge and constructing a unique frame to an academic inquiry. If their critical reasoning skills go untested, they can easily develop the belief that they are not capable.
AI text writers are handy for scraping the internet for factoids and providing inspiration. However, students need to scour their own memory banks and weave in what is relevant.
Educators can empower students by supporting them through problem-solving hurdles, which will foster in them the confidence needed to create solutions to life’s future challenges.
How can students leverage AI tools while maintaining optimum levels of rumination, scrutiny, and deduction in academic projects?
Solutions and Tips: Coaches and educators must help students understand how to use AI tools such as ChatGPT to aid, rather than replace, critical thinking.
- Teach students to use discernment in vetting all types of prose for veracity and cultural bias, including identifying the sources’ credibility, affiliations, and agendas.
- Before they begin using an AI chatbot, help students pinpoint the priorities of their research, such as which facets of a topic they consider most relevant and of greatest value in light of their personal experience and observations.
- Guide students to view AI writing generators as research tools, not as finished writing. Emphasize that initial chatbot output is a starting point, not a reliable authority on any topic.
- Help learners convert academic prompts into meaningful research inquiries.
- Encourage students to apply their vetting skills in analyzing output from AI writers, recognizing relevant portions, and weeding out inaccuracies and slanted perspectives.
- Teach learners to craft iterative queries and effective follow-up questions, incorporating their discoveries and pivoting their angles as needed, while using ChatGPT for research.
- Get in the weeds and mentor students in navigating concrete challenges. Model your own thinking process via a research example, and then guide learners through one or more AI inquiry exercises until they demonstrate proficiency in using the tool.
- Classroom teachers can invite students to design several iterative questions on a single topic using an AI chatbot, capture the resulting dialog, and present a critique of the process.
- Delve into the risks of AI-assisted research. Have students try to generate biased or inaccurate chatbot responses based on intentionally ineffective prompts.
- Guide students in using personal challenges, strengths, and opportunities as fodder for exploration in academic projects.
Coaches and educators can help students integrate AI into their arsenal of tools without allowing it to usurp their innate problem-solving capabilities.
Is your student looking for guidance in using research technology to maximize their academic confidence? Schedule a complimentary information session.
As an executive function coach and academic tutor, I specialize in helping individuals with learning differences exceed their goals for academics, organization, and college transition.