7 things to know about using an AI tutor

When Mashable tested the newest AI learning products from Anthropic, Google, and OpenAI, reporter Chase DiBenedetto came to some surprising conclusions about what works best in these tools — and what doesn't.

Comparing Anthropic's Learning Mode, Google's Guided Learning, and OpenAI's Study Mode, DiBenedetto found that each model had its strengths and weaknesses. She also discovered that she benefited differently depending on her own learning preferences.

She did the work so you don’t have to, but you might still be wondering: How can I set myself up for success, no matter which AI learning product I choose? 

Mashable spoke to experts in learning and AI, who shared their top tips for getting the most out of these tools:

1. There’s no evidence AI will improve your grades

Before you rush to turn the learning versions of ChatGPT, Claude, or Gemini into your favorite study buddy, consider their track record. 

Robbie Torney, senior director of AI programs at Common Sense Media, says that it’s still too early to have robust research demonstrating that AI tutors, in particular, boost learning outcomes or academic performance. Instead, supporters of AI learning products will likely point to studies that show how tutoring itself is very effective at increasing student achievement. 

But experts still can’t say for certain whether the AI behind ChatGPT, Claude, or Gemini can help you master a tough subject or change your fortunes in a challenging class.

That’s why Torney recommends “being clear-eyed about why we’re using AI and not just using AI for the sake of it.” 

2. Don’t assume the AI tutor is right 

AI chatbots are designed to produce answers that feel authoritative, but they are fallible and can be biased. Even in math, a subject that might seem less prone to errors, AI chatbots sometimes provide the wrong answer.

Similarly, if you're relying on an AI learning product to create flashcards as a study aid, be sure to double-check the facts for mistakes and hallucinations.


“All of these tools still generate incorrect information that sounds plausible but isn’t true,” Torney says. 

When studying topics that involve the analysis and interpretation of ideas and facts, like politics and history, AI chatbots may also incorporate ideological, racial, gender, and other types of bias into their content.

Be sure to regularly check that you’re not giving an AI chatbot more credence than it has earned. 

3. There are risks to AI tutors

Using an AI learning product is not risk-free. As a general rule, you should avoid sharing personally identifiable information with it, as some chat logs have been published or otherwise made publicly available online.

Dr. Julie Schell, assistant vice provost of academic technology at the University of Texas at Austin, believes there are many ways to use AI for learning. But if students are looking to offload cognitive tasks, they should stick with delegating the most tedious ones to AI, like organizing notes or creating outlines of material. The end goal isn’t to have AI do work on your behalf, but to save yourself time for deeper and more meaningful learning. 

You should also beware of turning your AI learning tool into an emotional confidant, because the product could pull you deeper into isolation or even despondency.

In August, a wrongful death lawsuit was filed against OpenAI, arguing that a teen user, Adam Raine, had initially turned to ChatGPT-4o as a homework helper but over time began discussing suicide with the chatbot. 

Raine ultimately received and followed directions from the chatbot for how to die by a specific means, according to the lawsuit. OpenAI said it is “working to improve” certain safeguards, particularly for users who engage in long conversations with ChatGPT. 

ChatGPT-4o remains available to the public, if the user selects that model.  

4. Have an experimental mindset when using AI for learning

Torney encourages students to try out different models to better understand which features are a good fit for them. The models currently available have different characteristics and abilities depending on the learning task at hand, he adds.

After testing Claude’s Learning Mode, Gemini’s Guided Learning, and ChatGPT’s Study Mode, Mashable’s Chase DiBenedetto reported on specific features worth evaluating, such as quizzes, flashcards, and conversational style.  

5. Don’t just use AI to help you memorize information 

Schell says that students often think they have to rehearse content if they want to learn it. Instead, she says, information must be retrieved in different contexts so that it becomes meaningfully integrated into one's memory.

So if you’re hoping that an AI learning product can help you grasp specific concepts or learn facts through memorization, Schell recommends a different strategy. 

One technique is to prompt the product to mix up the content when it quizzes you, so the material is presented in various contexts. Think, for example, of open-ended questions that require applying the material to a problem or explanation, rather than the standard set of quiz questions.

Schell, who serves on Anthropic’s Higher Education Advisory Board, says this should lead to more effortful retrieval, which can be very beneficial to learning. 

Additionally, while Schell recommends using AI to create flashcards, she says students should try to recall the answer first, before looking at the back of the card. This, too, can help maximize information retrieval.

6. Be mindful of the learning pit

Even if you’ve never heard of the so-called learning pit, you probably recognize the concept by its description. 

Schell says the learning pit is the figurative place students land when they’re struggling with a difficult concept, realize how much they don’t know, and feel like they just don’t get it — and might never understand the material. 

“Students tend to give up when they’re down at the bottom of that pit, but that’s like the worst place to give up, because you’re almost about to come out of it,” Schell says. 

If you're using AI in the hope that it'll get you out of the learning pit and even that isn't working, Schell says, don't feel like a failure.

In general, she recommends learning to “tolerate the struggle,” in addition to reminding yourself that this is an unavoidable part of learning. 

Also, if you’re desperately clinging to AI in the pit, that’s exactly the time you should look for human support.

7. Always have a human in the loop

Schell says that students should always be getting feedback from other humans — friends, classmates, teachers — about their learning, especially when using an AI tool as a study aid. 

Torney agrees, calling AI a “supplement” to a student’s learning process. 

Skilled teachers, for example, can often identify why a student is struggling and offer different approaches, because they know that individual well, Torney says. 

Schell recommends students look to peers who’ve just mastered a concept. They’re in a position to explain it well to someone who’s at the learning stage they just left. 

Overall, Torney says that AI learning tools are probabilistic machines, not grounded in what's known as pedagogy, or theories of learning. They simply can't replicate the social and collaborative nature of learning that students have known for generations.

“This can be part of a tool kit that helps a student learn, but this is not going to replace…holistic, well-balanced learning experiences that are rich and meaningful,” Torney says.
