
Unity Style Transfer for Person Re-Identification

conference contribution
posted on 2024-11-03, 14:38 authored by Chong Liu, Xiaojun Chang, Yi-Dong Shen
Style variation has been a major challenge for person re-identification, which aims to match the same pedestrians across different cameras. Existing works attempted to address this problem with camera-invariant descriptor subspace learning. However, these methods produce more image artifacts as the differences between images captured by different cameras grow larger. To solve this problem, we propose a UnityStyle adaptation method, which can smooth the style disparities both within the same camera and across different cameras. Specifically, we first create UnityGAN to learn the style changes between cameras, producing shape-stable, style-unified images for each camera, which we call UnityStyle images. We then use UnityStyle images to eliminate style differences between images, yielding better matches between query and gallery. Finally, we apply the proposed method to Re-ID models, expecting to obtain more style-robust deep features for querying. We conduct extensive experiments on widely used benchmark datasets to evaluate the performance of the proposed framework, and the results confirm the superiority of the proposed model.
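The abstract describes the pipeline only at a high level. The following minimal sketch is a hypothetical illustration of that idea, not the authors' released code: it assumes a toy generator standing in for UnityGAN that maps every camera's images into a shared "UnityStyle" space, and a toy Re-ID backbone whose features are compared between query and gallery after style unification. All class names, layer choices, and tensor shapes here are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class UnityGAN(nn.Module):
    """Toy stand-in for the style-unifying generator (architecture assumed)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        # Returns a style-unified ("UnityStyle") image of the same shape.
        return self.net(x)

class ReIDBackbone(nn.Module):
    """Toy feature extractor standing in for the Re-ID model."""
    def __init__(self, dim=128):
        super().__init__()
        self.conv = nn.Conv2d(3, dim, 3, padding=1)
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, x):
        f = self.pool(self.conv(x)).flatten(1)
        return F.normalize(f, dim=1)  # L2-normalised embedding

unity_gan, backbone = UnityGAN(), ReIDBackbone()
query = torch.randn(1, 3, 256, 128)    # one query crop (camera A)
gallery = torch.randn(8, 3, 256, 128)  # gallery crops (other cameras)

with torch.no_grad():
    q = backbone(unity_gan(query))     # features of the style-unified query
    g = backbone(unity_gan(gallery))   # features of the style-unified gallery
scores = q @ g.t()                     # cosine similarity after normalisation
best_match = scores.argmax(dim=1)      # index of the closest gallery image
print(best_match)

The point of the sketch is simply that both query and gallery images pass through the same style-unifying generator before feature extraction, so matching is done on style-normalised features rather than raw camera-specific ones.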

Funding

Towards data-efficient future action prediction in the wild (Australian Research Council)


History

Start page: 6886
End page: 6895
Total pages: 10
Outlet: Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2020)
Name of conference: CVPR 2020
Publisher: IEEE
Place published: United States
Start date: 2020-06-14
End date: 2020-06-19
Language: English
Copyright: © 2020 IEEE.
Former Identifier: 2006109337
Esploro creation date: 2021-08-28
