Search Results for author: Atsushi Irie

Found 5 papers, 0 papers with code

Extreme Compression of Adaptive Neural Images

no code implementations · 27 May 2024 · Leo Hoshikawa, Marcos V. Conde, Takeshi Ohashi, Atsushi Irie

The fundamental idea is to represent a signal as a continuous and differentiable neural network.
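
As a rough illustration of that idea, here is a minimal coordinate-based MLP that overfits a single image, so the trained weights themselves act as the compressed representation. The `NeuralImage` class, its size, and the plain-ReLU architecture are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

# Hypothetical coordinate-based MLP: maps pixel coordinates to RGB values,
# so the image is stored as network weights rather than a pixel grid.
class NeuralImage(nn.Module):
    def __init__(self, hidden=64, layers=3):
        super().__init__()
        dims = [2] + [hidden] * layers + [3]
        blocks = []
        for i in range(len(dims) - 1):
            blocks.append(nn.Linear(dims[i], dims[i + 1]))
            if i < len(dims) - 2:
                blocks.append(nn.ReLU())
        self.net = nn.Sequential(*blocks)

    def forward(self, coords):  # coords: (N, 2) in [-1, 1]
        return self.net(coords)  # (N, 3) RGB

# Fit the network to one image: the trained weights *are* the signal.
model = NeuralImage()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
coords = torch.rand(1024, 2) * 2 - 1   # stand-in pixel coordinates
target = torch.rand(1024, 3)           # stand-in ground-truth RGB values
for _ in range(100):
    opt.zero_grad()
    loss = ((model(coords) - target) ** 2).mean()
    loss.backward()
    opt.step()
```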

Mixed-precision Supernet Training from Vision Foundation Models using Low Rank Adapter

no code implementations · 29 Mar 2024 · Yuiko Sakuma, Masakazu Yoshimura, Junji Otsuka, Atsushi Irie, Takeshi Ohashi

To tackle these challenges, we first study effective search space design for fine-tuning a VFM, comparing operators such as resolution, feature size, width, depth, and bit-width in terms of performance and BitOPs reduction.

Neural Architecture Search
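
Since the paper fine-tunes a vision foundation model with low-rank adapters, here is a minimal LoRA-style sketch on a single linear layer; the `LoRALinear` name, rank, and scaling follow generic LoRA conventions and are not the paper's exact design.

```python
import torch
import torch.nn as nn

# Generic low-rank adapter: the frozen base weight W is augmented with a
# trainable low-rank update B @ A, so fine-tuning touches only
# r * (in + out) parameters instead of in * out.
class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():      # freeze pretrained weights
            p.requires_grad_(False)
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scale = alpha / r

    def forward(self, x):
        return self.base(x) + (x @ self.A.T @ self.B.T) * self.scale

layer = LoRALinear(nn.Linear(512, 512))
y = layer(torch.randn(4, 512))  # (4, 512)
```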

Efficient Joint Detection and Multiple Object Tracking with Spatially Aware Transformer

no code implementations · 9 Nov 2022 · Siddharth Sagar Nijhawan, Leo Hoshikawa, Atsushi Irie, Masakazu Yoshimura, Junji Otsuka, Takeshi Ohashi

We propose a lightweight and highly efficient Joint Detection and Tracking pipeline for the task of Multi-Object Tracking using a fully transformer-based architecture.

Multi-Object Tracking · Multiple Object Tracking · +1
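
As a hedged sketch of what a fully-transformer joint detection-and-tracking step can look like, the snippet below follows the common track-query pattern (carrying queries across frames so identity is maintained without a separate association stage); the abstract does not confirm the paper uses exactly this scheme, and all names are illustrative.

```python
import torch
import torch.nn as nn

# Query-based joint detection and tracking step: object queries decode
# detections from image features, and surviving queries are carried to
# the next frame as "track queries".
class JDTDecoderStep(nn.Module):
    def __init__(self, dim=256, heads=8):
        super().__init__()
        layer = nn.TransformerDecoderLayer(dim, heads, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=2)
        self.box_head = nn.Linear(dim, 4)    # (cx, cy, w, h)
        self.score_head = nn.Linear(dim, 1)  # objectness

    def forward(self, queries, memory):
        # queries: (B, Q, dim) = new detection queries + track queries
        # memory:  (B, HW, dim) = flattened image features for this frame
        h = self.decoder(queries, memory)
        return self.box_head(h), self.score_head(h), h

step = JDTDecoderStep()
det_q = torch.randn(1, 100, 256)   # fresh detection queries
track_q = torch.randn(1, 12, 256)  # queries kept from the previous frame
boxes, scores, h = step(torch.cat([det_q, track_q], dim=1),
                        torch.randn(1, 400, 256))
# Queries with high scores become this frame's tracks and seed the next frame.
```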

DynamicISP: Dynamically Controlled Image Signal Processor for Image Recognition

no code implementations · ICCV 2023 · Masakazu Yoshimura, Junji Otsuka, Atsushi Irie, Takeshi Ohashi

Image Signal Processors (ISPs) play important roles in image recognition tasks as well as in the perceptual quality of captured images.

Object Detection
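
A minimal sketch of the dynamically controlled ISP idea, assuming a single differentiable gain-plus-gamma stage whose parameters are predicted per image by a tiny controller and trained end-to-end with the recognizer; the choice of stage and the parameter ranges are illustrative assumptions, not the paper's design.

```python
import torch
import torch.nn as nn

# Dynamically controlled ISP stage (sketch): a small controller predicts
# per-image gain and gamma for a differentiable tone adjustment, so the
# ISP adapts to each input instead of using fixed parameters.
class DynamicGammaStage(nn.Module):
    def __init__(self):
        super().__init__()
        self.controller = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(3, 16), nn.ReLU(), nn.Linear(16, 2),
        )

    def forward(self, raw):  # raw: (B, 3, H, W) in [0, 1]
        params = self.controller(raw)
        gamma = torch.sigmoid(params[:, 0]) * 2 + 0.5  # gamma in [0.5, 2.5]
        gain = torch.sigmoid(params[:, 1]) * 3 + 0.5   # gain in [0.5, 3.5]
        out = (raw * gain.view(-1, 1, 1, 1)).clamp(0, 1)
        return out ** gamma.view(-1, 1, 1, 1)

stage = DynamicGammaStage()
img = stage(torch.rand(2, 3, 64, 64))  # feed `img` to a downstream detector
```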

Rawgment: Noise-Accounted RAW Augmentation Enables Recognition in a Wide Variety of Environments

no code implementations · CVPR 2023 · Masakazu Yoshimura, Junji Otsuka, Atsushi Irie, Takeshi Ohashi

We show that our proposed noise-accounted RAW augmentation method doubles image recognition accuracy in challenging environments using only simple training data.

Image Augmentation
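
A simplified sketch of noise-aware augmentation on RAW data: after scaling the signal to simulate a darker scene, noise is re-injected following a shot-plus-read noise model so the statistics stay physically plausible. The noise parameters, and the simplification of ignoring the noise already present in the input, are assumptions rather than the paper's exact procedure.

```python
import torch

# Noise-aware RAW augmentation (sketch): plain brightness scaling would
# leave noise statistics unrealistic, so after scaling the signal we add
# noise whose variance grows with signal level (shot) plus a floor (read).
def augment_raw(raw, scale, shot=0.01, read=0.001):
    signal = raw * scale                                   # simulate exposure change
    sigma = torch.sqrt(shot * signal.clamp(min=0) + read ** 2)
    return (signal + sigma * torch.randn_like(signal)).clamp(0, 1)

aug = augment_raw(torch.rand(1, 4, 32, 32), scale=0.25)  # simulate low light
```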
