Evaluation on the generalization of coded features across neural networks of different tasks
- Resource Type
- Conference
- Authors
- Liu, Jiawang; Liu, Ao; Jia, Ke; Yu, Hualong; Yu, Lu
- Source
- 2023 IEEE International Conference on Visual Communications and Image Processing (VCIP), pp. 1-5, Dec. 2023
- Subject
- Communication, Networking and Broadcast Technologies; Components, Circuits, Devices and Systems; Computing and Processing; Signal Processing and Analysis; Visualization; Image coding; Visual communication; Bit rate; Feature extraction; Multitasking; Task analysis; feature generalization; feature compression; collaborative intelligence
- Language
- ISSN
- 2642-9357
- Abstract
- Recent advances in deep neural networks (DNNs) for computer vision have made intelligent analysis on edge devices more prevalent and practical. To better distribute the computational load between edge devices and the cloud, a deep learning deployment strategy called Collaborative Intelligence (CI) has been proposed: features extracted on edge devices are compressed and then transmitted to the cloud. However, it is unclear whether these compressed features retain enough information to support diverse downstream tasks. This paper studies how well compressed features from one neural network generalize to other object detection and instance segmentation networks. We first propose a scheme to evaluate feature generalization and then perform feature compression experiments. Our experiments show that the extracted features contain enough information for other task networks, and that the feature compression scheme for multi-task networks offers an 82.04% average bitrate saving compared to VVC.
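The Collaborative Intelligence pipeline described in the abstract can be sketched as follows: the edge device extracts an intermediate feature tensor, compresses it before transmission, and the cloud side reconstructs it for downstream task heads. The sketch below is purely illustrative and is not the paper's codec; it uses simple uniform quantization with an empirical-entropy rate estimate as a stand-in for a learned or VVC-based feature compressor, and all shapes and parameters are assumptions.

```python
import numpy as np

# Toy stand-in for CI feature compression: quantize an edge-side
# feature tensor, estimate its rate, and reconstruct it cloud-side.
rng = np.random.default_rng(0)
features = rng.normal(size=(1, 64, 32, 32)).astype(np.float32)  # assumed feature shape

def quantize(x, n_bits=6):
    """Uniformly quantize x to 2**n_bits levels over its observed range."""
    lo, hi = float(x.min()), float(x.max())
    levels = 2 ** n_bits - 1
    q = np.round((x - lo) / (hi - lo) * levels).astype(np.int32)
    return q, lo, hi, levels

def dequantize(q, lo, hi, levels):
    """Map quantization indices back to approximate feature values."""
    return (q.astype(np.float32) / levels) * (hi - lo) + lo

q, lo, hi, levels = quantize(features)
recon = dequantize(q, lo, hi, levels)

# Rough rate proxy: empirical entropy of the quantized symbols (bits/element).
_, counts = np.unique(q, return_counts=True)
p = counts / q.size
entropy = float(-np.sum(p * np.log2(p)))

# Distortion of the reconstructed features the cloud-side networks would see.
mse = float(np.mean((features - recon) ** 2))
print(f"rate ~ {entropy:.2f} bits/element, distortion (MSE) = {mse:.6f}")
```

In a real CI deployment, the reconstructed tensor would be fed to multiple task heads (detection, segmentation) to test generalization, and the rate would come from an actual entropy coder rather than this entropy estimate.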