Goldfish in a tank doing a number task


There has been a recent surge of interest in studying fish cognition. Fish perceptual and cognitive abilities compare very well to those of other vertebrates on most tasks (Brown, 2015). For example, there are parallels between fish and non-human primates on many social cognition phenomena such as individual recognition, cooperation, eavesdropping, social learning, cooperative hunting, cheating, punishment, and altruism (Bshary et al., 2002). In our lab at RIT, we are studying a number of topics in goldfish, including visual object perception, perceptual constancy, and numerical perception. We investigated the visual features used by fish to discriminate among objects such as geometric and complex shapes (DeLong, Keller, Wilcox, Fobe, & Keenan, 2018). In another line of research, we presented goldfish with a relative quantity judgment task to assess their numerical abilities. Our study suggested that fish given extensive training (over 1,000 trials) can achieve accuracy on a numerical task comparable to that of well-trained birds, humans, or non-human primates (DeLong, Barbato, O'Leary, & Wilcox, 2017). We are currently examining the ability of fish to identify 2D and 3D objects despite changes in orientation (DeLong, Fobe, O'Leary, & Wilcox, 2018).

Dolphin just below the surface of the water


Dolphins produce an array of sounds including echolocation clicks and whistles (Au, 1993). Many acoustic features can be measured in echoes and whistles (e.g., duration, target strength, peak frequency, spectrum shape, bandwidth). However, the salient acoustic features that animals use when relying on echoes and whistles in a given task can be difficult to identify. One approach we have used to identify the acoustic features in echoes that dolphins may use to discriminate among objects is to present a dolphin with an echoic discrimination task, then measure the object echoes and statistically analyze acoustic differences among the objects. These between-object differences can be examined in conjunction with the dolphin's errors to identify the acoustic features the dolphin may have used in the task (DeLong et al., 2006). Another approach we have used is to record echoes from objects used in a discrimination task with a dolphin subject and use artificial neural networks to identify echoic cues that enable objects to be recognized (DeLong et al., 2014). A third method for detecting salient acoustic features, which we have used frequently in the CCP Lab, is to compare the performance of cetaceans and humans on the same auditory perception tasks. In a typical study, a dolphin and a group of human participants are asked to discriminate among the same set of objects (Branstetter et al., 2016). Across numerous studies, we have found that humans listening to prerecorded echoes typically perform as well as or better than dolphin subjects and can report the cues they used to discriminate (for a review see DeLong, 2017). Human listening studies reveal echo acoustic features that are likely to be salient for dolphins, as well as processing mechanisms and decision strategies that dolphins may use. These studies can also assist in conservation efforts (Morrison, DeLong, & Wilcox, 2020).
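To make the idea of measuring acoustic features in an echo concrete, here is a minimal sketch of extracting two of the features mentioned above, duration and dominant frequency, from a digitized waveform. The 10% envelope threshold and the zero-crossing frequency estimate are illustrative assumptions for this sketch, not the analysis pipeline used in the studies cited:

```python
import math

def echo_features(echo, fs):
    """Extract two simple acoustic features from a digitized echo.
    The 10% amplitude threshold and zero-crossing estimate below are
    illustrative choices, not the lab's actual measurement methods."""
    # Duration: time span where the amplitude exceeds 10% of its maximum
    peak = max(abs(s) for s in echo)
    loud = [i for i, s in enumerate(echo) if abs(s) >= 0.1 * peak]
    duration = (loud[-1] - loud[0]) / fs

    # Dominant frequency: a roughly sinusoidal signal crosses zero
    # twice per cycle, so crossings / (2 * total time) estimates it
    crossings = sum(1 for a, b in zip(echo, echo[1:]) if a * b < 0)
    dominant_freq = crossings / (2 * (len(echo) / fs))

    return {"duration_s": duration, "dominant_freq_hz": dominant_freq}

# Example: a synthetic 80 kHz tone burst, 1 ms long, sampled at 500 kHz
fs = 500_000
n = int(0.001 * fs)
echo = [math.sin(2 * math.pi * 80_000 * i / fs) *
        (0.5 - 0.5 * math.cos(2 * math.pi * i / (n - 1)))  # Hann taper
        for i in range(n)]
print(echo_features(echo, fs))
```

A real analysis would measure many more features (target strength, spectrum shape, bandwidth) with spectral methods such as the FFT; this sketch only illustrates the general idea of reducing an echo to a small set of numbers that can then be compared statistically across objects.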
Another line of research in the lab is focused on understanding how dolphins can visually perceive objects from different aspect angles (DeLong, Fellner, Wilcox, Odell, & Harley, 2020). 

An otter looking at a stick


There has been little research on visual perception in North American river otters. The goal of one of our research studies was to explore the visual features of objects that are important to these animals. This project took place at the Seneca Park Zoo (Rochester, NY). We found that river otters can discriminate between objects varying in color or shape (DeLong, Wright, Fobe, Wilcox, & Morrison, 2019). Understanding more about the visual features otters use to discriminate objects will significantly increase our knowledge of these animals, which could help in conservation efforts. River otters nearly disappeared from NY and other states by the 1960s due to water pollution, the elimination of wetlands, and unregulated trapping. They were reintroduced to Western NY in 1995 by capturing otters from the Adirondacks and releasing them along the Genesee River. But threats remain for otters: water pollution, accidental trapping in beaver traps, and disappearing wetlands. We need to know more about how otters see the world in case their population levels decline and further conservation efforts are needed. This research also provides benefits for otters under human care by helping us design better enrichment devices for zoos and aquariums. The project was filmed for a BBC Natural World documentary on otters. The BBC film crew was fantastic and we had so much fun filming the project at the Seneca Park Zoo! We are currently investigating visual object categorization in North American river otters.

Orangutan in a cage with fingers gripping the cage


Non-human primates such as orangutans, chimpanzees, and gorillas can use tools to solve problems. Our research has focused on tool use in Bornean orangutans at the Seneca Park Zoo (Rochester, NY). One set of studies examined the ability of the orangutans to use stick tools to move a food reward through a series of maze configurations in a puzzle box (Keller & DeLong, 2016; O'Leary & DeLong, 2016). The performance of the orangutans was compared to that of a group of human children. In one study we investigated prospective cognition (planning ahead), and in another study we explored observational learning. Another study examined the orangutans' ability to use water as a tool to extract a peanut from a transparent tube. The water tube task, adapted from Mendes et al. (2007), was novel for the two captive orangutans tested in this study. Both orangutans were successful, but only one solved the task in the most difficult condition. These results show that Bornean orangutans are capable of using water as a tool (DeLong & Burnett, 2020).

A group of black and white penguins


This research focused on the perception of rhythm in African penguins. Rhythm in penguin vocalizations may carry important information about individual identity in nesting and non-nesting species (Aubin & Jouventin, 2002; Favaro et al., 2014). African penguins' longest vocalization, the ecstatic display song, starts with a chain of short syllables followed by one or two long syllables and one inhalation syllable. The rhythmic components of the ecstatic display song appear to be unique to individuals. Rhythm can carry information even in large penguin colonies where sound is readily masked and attenuated, because rhythm remains unchanged over long distances. This project took place at the Seneca Park Zoo (Rochester, NY). We used a habituation method to investigate whether the birds can discriminate between different rhythmic sequences. We did not find conclusive evidence for rhythm perception in our first study (Fobe & DeLong, 2018). If African penguins are able to discriminate rhythm, this information could be used to bolster conservation efforts for this species. African penguins are listed as endangered due to overfishing, oil spills, habitat loss, and entanglement in fishing equipment.

Baboon performing a test through a metal grate


We are collaborating with researchers at Carnegie Mellon University on The Primate Portal. The Portal is located at the Seneca Park Zoo, where we have trained a troop of olive baboons (Papio anubis) to solve cognitive problems using touchscreen computers. All of the baboons' responses are recorded, and the sessions are videotaped and live-streamed on YouTube. We engage in K-12 STEM outreach, working with students at local schools to code games for the baboons. We are also developing tools to facilitate visualization and analysis of our data for younger students. The goal is for data from the Primate Portal to be used in elementary and high schools to facilitate data literacy, develop student science and coding projects, and inspire scientific discovery.