Description: A beekeeper has an interest in keeping the bee population as healthy as possible. Traditionally, achieving this requires regular hive inspections by the beekeeper, which cause stress for the bees. Other stressors include diseases, parasites and pesticides. Environmental factors affecting the health of a bee population include weather conditions, microclimate, the plants in the surrounding area and their flowering time. The condition of a bee population can be inferred from the bees' behavior and activity. By correlating this behavior with other known information (buildings, plants/pollen, weather), we can infer the most favorable conditions for hive placement and ensure that bees operate in a healthy environment. Keeping bee populations healthy is particularly important today: climate change threatens life on Earth as we know it, partly through a possible collapse of biodiversity.
In this project, we employ state-of-the-art sensing technologies (noise, temperature and humidity sensors, optical and thermal cameras, camera traps), together with state-of-the-art computer vision (deep learning, DL) and remote sensing (aerial photography from drones), to create smart beehives that monitor bee behavior and population numbers in real time and examine, also in real time, potential threats to the colonies (anomaly detection). This allows us to understand the bees' well-being, react quickly to dangers, especially those related to climate change, and make the life of beekeepers easier. Remote sensing maps the nearby environment (e.g. up to a distance of 5 kilometers) in terms of plants and flowers, while DL monitors the movement of bees inside, into and out of the beehive, observing which type of pollen (and, correspondingly, which flowers) they carry and visit. Finally, camera traps will be placed strategically near significant sources of nearby flowers, to understand where and when the bees travel to locate food, and whether this creates conflicts with wild bees in the larger area under study.
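To make the anomaly-detection idea concrete, a minimal baseline over a single hive sensor stream can be sketched with a rolling z-score: flag any reading that deviates sharply from the recent history of that sensor. This is an illustrative sketch only, not the project's actual pipeline; the function name, window size, threshold and the synthetic temperature series below are all assumptions for the example.

```python
from statistics import mean, stdev

def detect_anomalies(readings, window=24, threshold=3.0):
    """Flag indices whose reading deviates by more than `threshold`
    standard deviations from the rolling statistics of the previous
    `window` samples (e.g. hourly hive-temperature readings)."""
    anomalies = []
    for i in range(window, len(readings)):
        past = readings[i - window:i]
        mu, sigma = mean(past), stdev(past)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies

# Synthetic hourly temperatures: a stable brood nest around 35 °C,
# with one injected spike simulating a possible threat event.
temps = [35.0 + 0.1 * (i % 5) for i in range(48)]
temps[40] = 39.5  # sudden jump, e.g. overheating or a sensor fault
print(detect_anomalies(temps))  # → [40]
```

In a deployed smart hive, the same check would run per sensor (noise, temperature, humidity), with window and threshold tuned to each signal; a learned model could later replace this baseline once enough labeled data is collected.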
The high-level objectives of the project are the following:
- Build smart beehives that monitor bee behavior, activity and population numbers in real time using sensors and cameras.
- Detect anomalies that signal potential threats to the colony, so that beekeepers can react quickly.
- Map the surrounding environment (plants, flowers, buildings) with aerial photography to identify the most favorable hive placements.
- Track foraging patterns with camera traps to understand where and when bees travel for food, and whether this creates conflicts with wild bees in the wider area.
Collaboration with: CYENS LEAR MRG, CYENS MakerSpace, EMME-CARE Center
Techniques used: Internet of Things, computer vision, deep learning, GIS and geospatial analysis, aerial photography, machine learning, open hardware.
Started: March 2022
Status: On-going