Abstract
Ecological studies of pollinators often require long-term, extensive monitoring, which imposes significant costs and limits research. Traditional sampling methods such as sweep netting and pan trapping provide valuable information on pollinator diversity but scale poorly for large sampling efforts across space and time. We introduce "buzzdetect," a tool that applies deep learning models to audio data for passive acoustic monitoring of pollinators, and we test our accompanying audio classification model. The model distinguishes the buzzing of insect flight from environmental noise on a second-by-second basis with a sensitivity of 28% and a precision of 95%. As a demonstration of buzzdetect's value, we apply the tool to recordings from four crops and one wildflower. The results reveal differences in the timing and intensity of foraging that correspond with prior literature: activity peaked earliest for chicory and latest for soybean, while total activity was higher in mustard and soybean than in the other crops.