ademnea@cit.ac.ug +256 701909833

Name: Sensors and signal processing.

Partners: MAK, UoJ

Duration: M1-M60

Description:

Design and implementation of sensors, supporting electronics and data processing methods for: 1) automatic real-time identification, counting and tracking of insects that are key agents in the pollination process (honey bees, solitary bees, bumble bees, carpenter bees, hoverflies, butterflies) or crop pests (ants, beetles, caterpillars, army worm, mites, locusts, aphids), and 2) monitoring of the ambient environment variables where the insects appear, primarily air temperature, relative humidity, wind (speed and direction), light intensity, precipitation and soil moisture. These data shall be stored and integrated in digital platforms that facilitate analysis of temporal and spatial species abundance and diversity, improvement of insect management, reduction of pesticide usage and institution of conservation measures for pollinating insects.

Methods and Tools

Recent technology for identification and tracking of insects utilizes: 1) audio spectrum analysis to associate audio signatures with key insects [1,2], 2) machine learning-based insect recognition [3-7], supported by still images/video captured with cameras, and 3) radar [8-10], RFID [11-13] and video [14] based tracking of insects. Current implementations of these technologies are expensive and untenable for ordinary farmers in Africa. This work will leverage recent advances in miniaturized sensors, embedded computing, and machine learning to develop low-cost platforms that achieve robust insect recognition and tracking. The devices developed will be connected to WSNs to facilitate real-time surveillance, and to cloud computing for analytics toward effective decision making for farmers and governments. Regarding ambient environment variables, the WIMEA-ICT project [15] has yielded a state-of-the-art Automatic Weather System that records air temperature, relative humidity, wind (speed and direction), light intensity, precipitation, and soil moisture. We will piggyback on the output of [15], with a focus on building energy-efficient nodes that integrate self-learning for transmission power control, data dimensionality reduction, and gateway synchronization.

Potential innovation: This work will demonstrate the development of novel low-cost data acquisition methods and tools for capturing insect-specific audio/image/video signals, spatial localization data and ambient environment variables, and novel machine learning models that utilize this data for automatic insect recognition, counting and tracking.

Deliverables: Reports on the solutions, methods and tools specified in the task descriptions below.

Interdependencies: WP2 will interact with WP1 (to achieve energy-efficient and resilient node design) and WP3 (entomologist-guided hardware designs targeting specific insect species, and data labeling toward supervised learning).

Resource requirements: [2 PhD students, 1 post-doc, 4 MSc students, sensors and accompanying electronics hardware, computing hardware for machine learning experiments, field experiments for solution deployment]

 

Task 2.1 Insect Identification from Audio Spectrum Analysis

Partners: MAK, UoJ

Duration: M1-M60

Description of work: Design and implementation of audio-based methods and tools for identifying key flying insects in a challenging (noise-prone, low amplitude sound), data-scarce domain. The tools and methods developed will facilitate estimation of insect population density, diversity, and distribution across natural and agricultural landscapes.

The key activities include:

  1. Development of an intelligent audio trap for flying insects (honeybees, solitary bees, bumble bees, carpenter bees, hoverflies, butterflies, locusts, aphids). The system will consist of a Raspberry Pi [16], a low-cost and portable computing platform, equipped with a 12.3-megapixel Sony IMX477 camera sensor with custom-built night vision capability [17] and the Snowball USB sound sensor (up to 48 kHz) [18]. A generic insect detection machine learning model will be developed using open-source insect datasets [19-22], bundled on the Pi’s Linux operating system, and installed on a custom-built data logger. When an insect is detected in the camera’s near-field line of sight (<200 mm), the microphone will automatically record its audio signature (triggered by image-based insect recognition) for a 2-second duration. For field deployment, the system will be continuously solar-powered with a 12 V 50 W polycrystalline solar panel and a 12 V 16 Ah lithium-ion battery. The control interface for the data acquisition system will be accessible via USB connection (tethering) or a wireless link (WLAN) to a smartphone or laptop, where the audio can be streamed in real time, the system can be tuned, and data can be downloaded remotely. The system will also integrate a 32 GB SD card for local data storage.

  2. Labeling of audio data, by expert entomologists, supported by corresponding image data.

  3. Development of a robust deep learning model, based on convolutional neural networks, for automatic classification of insect species from audio data. Validation by comparison with traditional machine learning approaches such as Random Forests and Support Vector Machines.

  4. Deployment of the developed model on hardware and validation of the system.

  5. Development of diurnal activity patterns for the different insect species.
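The image-triggered audio capture in activity 1 can be sketched as a simple polling loop. This is a minimal illustration only: the frame source, detector, and recorder below are hypothetical stand-ins for the Pi camera pipeline, the bundled detection model, and the Snowball microphone.

```python
RECORD_SECONDS = 2  # audio clip length triggered per detection (from the task description)

def run_trap(frames, detect, record_audio):
    """Poll camera frames; when the image-based detector fires,
    capture a short audio clip and pair it with the frame index."""
    clips = []
    for i, frame in enumerate(frames):
        if detect(frame):                       # image-based insect detection
            clips.append((i, record_audio(RECORD_SECONDS)))
    return clips

# Toy stand-ins for the real camera/microphone pipeline:
frames = ["empty", "bee", "empty", "hoverfly"]
detect = lambda f: f != "empty"
record_audio = lambda secs: f"{secs}s-clip"

print(run_trap(frames, detect, record_audio))
# → [(1, '2s-clip'), (3, '2s-clip')]
```

In the deployed system the loop body would write each clip (with its timestamp and triggering image) to the SD card for later labeling by entomologists.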

Potential innovation: Low-cost sensor hardware, novel machine learning models trained on audio signatures for insect identification.

Deliverables:

D2.1.1 Equipment and algorithms for recording and automatic recognition of insect acoustic signatures.

D2.1.2 Openly available labeled audio data repository for insect pollinators and pests (target at least 10,000 audio files for each insect class).

Interdependencies: WP1 and WP3.

Resource requirements: [specify reliance on students, field work, infrastructure, equipment]

 

Task 2.2 Insect Recognition and Counting from Image/Video Data 

Partners: MAK, UoJ

Duration: M1-M60

Description of work: Design and implementation of image-based proximity sensing methods and tools for automatic detection and counting of key flying/crawling pollinating and pest insects. The tools and methods developed will facilitate estimation of insect population density, diversity, and distribution across natural and agricultural landscapes.

The key activities include:

  1. Development of an intelligent insect trap for flying and crawling insects. For crawling insects, the trap will be weather-proof and ground-deployed, and a near-field camera sensor will be set up facing the ground. When an insect is detected in the camera’s field of view, the camera will automatically trigger recording of video data (2 seconds). The recorded video will be stored as time-compressed movies, using custom-built video compression algorithms, for local and remote access.

  2. Labeling of video data by expert entomologists to facilitate supervised learning.

  3. Development of a multi-class insect recognition model, which identifies an insect species and localizes it in an image frame. The model will be developed using state-of-the-art deep learning object recognition models such as YOLO [23] and RetinaNet [24] as benchmarks. The computational cost of the developed model will be optimized to facilitate real-time performance.

  4. Deployment of the learned model on the computing platform for smart identification of the target insect species by class and, for each class, production of a count estimate.
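The counting step in activity 4 amounts to aggregating per-frame detector output into per-class totals. A minimal sketch, assuming the recognition model emits (label, confidence) pairs in a YOLO-style format; the confidence threshold is an assumed value that would be tuned on validation data:

```python
from collections import Counter

CONFIDENCE_THRESHOLD = 0.5  # assumed cut-off; tune on labeled validation data

def count_by_class(detections):
    """Aggregate detector output (class label, confidence) into
    per-class count estimates, keeping only confident detections."""
    kept = [label for label, conf in detections if conf >= CONFIDENCE_THRESHOLD]
    return Counter(kept)

# Toy detections as (label, confidence) pairs:
detections = [("honeybee", 0.91), ("hoverfly", 0.34),
              ("honeybee", 0.77), ("aphid", 0.62)]
print(dict(count_by_class(detections)))
# → {'honeybee': 2, 'aphid': 1}
```

A field deployment would additionally need tracking across frames so that one insect lingering in view is not counted repeatedly; that refinement is omitted here.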

Potential innovation: Low-cost sensor hardware, novel machine learning models trained on image/video for insect identification.

Deliverables:

D2.2.1 Equipment and methods for insect-triggered video capturing

D2.2.2 Openly available labeled image data repository for insect pollinators and pests (target at least 10,000 images for each class).

D2.2.3 Multi-class learned model for robust and accurate insect recognition.

D2.2.4 Deployed intelligent insect detection and counting system.

Interdependencies: WP1 and WP3

 

Task 2.3 Telemetry Tracking of Insects 

Partners: NTNU DIT, MAK, UoJ

Duration: M1-M60

Description of work: Design and implementation of a light-weight long-range insect tracking system utilizing passive radio frequency identification (RFID) tags. RFID offers the advantage of rapid and simultaneous insect detection and is less disruptive to insect behavior given the small size of the tags (target of <30% of body weight). The proposed system will utilize passive tags (not equipped with a power source, but deriving signal power from the reader). The system will support insect detection when an insect passes in proximity (up to 10 meters) of the reader. The developed system will be utilized to track honeybees and stingless bees in managed and wild nests.

The key activities include:

  1. Development of passive miniaturized RFID tags

  2. Design of tag placement protocols on insects

  3. Development of high-resolution long-range reader hardware based on Ultra High Frequency (860–960 MHz) radio.

  4. Development of the embedded software, communication gateways and a watchdog system.

  5. Development of machine learning-based outlier detection models for false-positive filtering.

  6. Integration, deployment and testing of the tracking system on honeybees.
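The false-positive filtering in activity 5 could start from a simple statistical baseline before moving to learned models. The sketch below flags RFID reads whose signal strength is an outlier for the tag, using a median-absolute-deviation rule; the threshold and the RSSI values are illustrative assumptions, not measured figures:

```python
from statistics import median

K = 3.0  # assumed threshold in median-absolute-deviations; tune on field data

def filter_outliers(rssi_values, k=K):
    """Drop reads whose signal strength deviates from the tag's median
    by more than k MADs -- a simple stand-in for the learned filter."""
    med = median(rssi_values)
    mad = median(abs(v - med) for v in rssi_values) or 1e-9  # guard zero MAD
    return [v for v in rssi_values if abs(v - med) / mad <= k]

reads = [-60, -62, -61, -20, -63]   # RSSI in dBm; -20 is a spurious read
print(filter_outliers(reads))
# → [-60, -62, -61, -63]
```

A learned model would replace the fixed rule with features such as read timing, antenna identity, and tag movement plausibility.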

Potential innovation: An intelligent, accurate and low-cost RFID-based solution for insect tracking.

Deliverables:

D2.3.1 Methods and tools for passive RFID insect tagging and detection.

Interdependencies: WP1, WP3

 

Task 2.4 Sensor Gateway and Weather Parameters 

Partners: MAK, DIT, UoJ, NTNU, UoB

Duration: M1-M60

Description of work: Design and implementation of a node with an in situ gateway, for transmitting data to the Internet, adding location, time stamp, and the weather parameters: air temperature, relative humidity, wind (speed and direction), light intensity, precipitation and soil moisture. The node shall adaptively minimize resource use to increase the node’s life, reliably buffer received data and reliably transmit it even with intermittent network conditions. The node shall provide for multiple uplink options. This work will build on the results from WIMEA-ICT project. The weather parameters will be utilized to analyze the effect of weather patterns on insect species’ abundance and diversity in agricultural landscapes.

The key activities include:

  1. Design and implementation of node hardware (requisite sensors, electronic circuitry interfacing sensors to the gateway, the gateway and power supply).

  2. Development of node software, integrating self-learning for transmission power control, data dimensionality reduction, and gateway synchronization.

  3. Development of a representation tool for data management.
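The requirement that the node "reliably buffer received data and reliably transmit it even with intermittent network conditions" can be sketched as a store-and-forward queue flushed oldest-first during uplink windows. A minimal illustration under assumed interfaces; the `send` callable stands in for whichever uplink the node selects:

```python
from collections import deque

def flush(buffer, send):
    """Attempt to transmit buffered readings oldest-first; on the first
    failed send, stop and keep the remainder for the next uplink window."""
    sent = 0
    while buffer:
        if not send(buffer[0]):
            break                    # link down: retain readings, retry later
        buffer.popleft()
        sent += 1
    return sent

buffer = deque([{"t": 1, "temp": 23.4},
                {"t": 2, "temp": 23.1},
                {"t": 3, "temp": 22.9}])
link_up = iter([True, True, False])  # simulated intermittent network
print(flush(buffer, send=lambda msg: next(link_up)))
# → 2  (one reading remains buffered)
```

Peeking before popping ensures a reading is only removed from the buffer after a confirmed send, so a dropped link never loses data.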

Potential innovation: Energy efficient, integrated, and resilient weather monitoring nodes

Deliverables:

D2.4.1 Hardware for sensor nodes and gateway

D2.4.2 Machine learning framework for transmission power control

D2.4.3 Algorithms for data dimensionality reduction and gateway synchronization.

D2.4.4 Data representation tool.

Interdependencies: WP1 and WP3

Other Work Packages

Networks and Resilience (WP1)

Sensor data must be collected from both static and mobile sensors in the field and then aggregated by nodes which are energy-constrained, either because they rely on batteries or on local power sources such as solar panels.

Sensors and signal processing (WP2)

These data shall be stored and integrated in digital platforms that facilitate analysis of temporal and spatial species abundance and diversity, improvement of insect management, reduction of pesticide usage and institution of conservation measures for pollinating insects.

Data Analytics for Environment Monitoring services (WP3)

Development of automated information collection for insect pollinators and pests, and, for the first time, the combined utilization of large weather information data sets in insect pollinator conservation planning and pest control method design.