Readers ask about AI ethics, monkey tool use and more

The head and the heart

Scientists used light to raise a mouse’s heart rate, increasing anxiety-like behaviors in the animal. The study offers a new angle for studying anxiety disorders, Bethany Brookshire reported in “In mice, anxiety isn’t all in the head” (SN: 4/8/23, p. 9).

Reader Barry Maletzky asked why strenuous exercise, which elevates heart rate, doesn’t typically induce anxiety.

Heart rate isn’t everything, says neuroscientist Karl Deisseroth of Stanford University. The heart may race, but the brain provides important context, which is key to the body’s response. In the study, elevating a mouse’s heart rate in a neutral environment — such as a small, dim chamber — did not induce anxious behaviors, Deisseroth says. The anxious behaviors increased only when the heart rate was raised in a threatening context, like an open space where a small mouse could be a snack for a predator.

Monkey business

Some macaques inadvertently made stone flakes while using rocks to crack open nuts, raising questions about whether ancient stone flake tools attributed to hominids were made accidentally, Bruce Bower reported in “Monkeys’ stone flakes look like hominid tools” (SN: 4/8/23, p. 13).

Reader Jerald Corman wondered how scientists knew that the monkeys created the stone flakes unintentionally.

We know this because the flakes were produced only when a monkey attempted to hit a nut with a rock and missed, says primatologist Lydia Luncz of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. The monkeys “pay absolutely no attention to whatever breaks off. They don’t pick it up. They don’t look at it,” she says. “When a stone breaks multiple times, they just pick a new one.”

AI ethics

The chatbot ChatGPT and other artificial intelligence tools are disrupting education, Kathryn Hulick reported in “Homework help?” (SN: 4/8/23, p. 24).

The material that ChatGPT generates is technically not considered plagiarism because it’s new and original, Hulick wrote. Reader Joel Sanet wondered about student papers that may have been written entirely by AI. Can a human plagiarize or steal the intellectual property of an inanimate object?

Plagiarism means passing off someone else’s work as your own. “If you claim that you wrote something, but it was actually written by a chatbot, that would be a type of plagiarism,” says Casey Fiesler, an expert in technology ethics at the University of Colorado Boulder. It’s important to remember that in most situations, plagiarism isn’t illegal, Fiesler says. In education, it’s almost always an honor code violation. “The most important thing is to be honest about how you’re using AI,” she says.

Intellectual property is a separate and very interesting issue, Fiesler says. "The U.S. Copyright Office recently established that work created wholly by AI cannot be copyrighted because there isn't a human author," she says. "But I suspect that in the coming days, we'll see a lot of discussion (and litigation) that tests the edges of ownership and intellectual property when it comes to AI."


Correction
“Homework help?” incorrectly stated that Jon Gillham’s AI-detection tool identified 97 percent of 20 text samples created by ChatGPT and other AI models as AI-generated. The tool identified all 20 samples as AI-generated, with 99 percent confidence, on average.