Person re-identification (ReID) is an important task in computer vision. Supervised methods have achieved high performance, but that performance degrades sharply when they are applied to unlabeled data, because image styles differ considerably across scenes. To address this problem, we propose an attention mutual teaching (AMT) network for unsupervised domain-adaptive person ReID. AMT improves the model through iterative clustering and retraining, while its two attention modules teach each other to reduce clustering noise. Extensive experiments on the Market-1501 and DukeMTMC-reID datasets show that our approach outperforms state-of-the-art unsupervised methods.
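The iterative clustering-and-retraining loop mentioned in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' AMT implementation: the toy K-means, the cluster count k, and the prototype-style "retraining" step (pulling features toward their assigned cluster centers) are all assumptions standing in for a real feature extractor and fine-tuning procedure.

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Toy K-means with deterministic farthest-point initialization."""
    centers = [X[0]]
    for _ in range(1, k):
        # Pick the point farthest from all chosen centers as the next center.
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        labels = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels, centers

def pseudo_label_retrain(features, k, rounds=3):
    """Assign pseudo-labels by clustering, then 'retrain' on them; repeat.

    In a real pipeline the model would be fine-tuned on the pseudo-labeled
    data each round; here retraining is simulated by nudging features
    toward their cluster centers, which tightens the clusters over rounds.
    """
    for _ in range(rounds):
        labels, centers = kmeans(features, k)
        features = 0.9 * features + 0.1 * centers[labels]
    return labels, features
```

The point of the sketch is the alternation: each round's pseudo-labels come from clustering the current features, and the updated features feed the next round's clustering.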
As one of the most important tasks in computer vision, online object tracking plays a critical role in numerous lines of research and has drawn considerable attention thanks to its many real-world applications. This paper develops a novel tracking algorithm based on a bag-of-local-patches representation with a discriminative learning scheme. In the first frame, a codebook is learned by applying the K-means algorithm to a set of densely sampled local patches of the tracked object; this codebook is then used to represent the template and the candidate samples. During tracking, the similarities between the coding coefficients of the candidates and those of the template serve as the likelihood values of the candidates. In addition, we propose effective model-updating and discriminative learning schemes to capture appearance changes of the tracked object and to incorporate discriminative information for robust matching. Both qualitative and quantitative evaluations on challenging image sequences demonstrate that the proposed tracker outperforms other state-of-the-art tracking methods.
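The codebook-and-coding pipeline described above can be sketched in a few lines. This is a hedged illustration under simplifying assumptions: hard-assignment coding (a histogram of nearest codewords) stands in for whatever coding scheme the paper actually uses, histogram intersection stands in for its similarity measure, and the toy K-means with farthest-point initialization is not the authors' implementation.

```python
import numpy as np

def build_codebook(patches, k, iters=20):
    """Learn a K-means codebook from local patch descriptors (N x D)."""
    centers = [patches[0]]
    for _ in range(1, k):
        # Farthest-point initialization keeps the sketch deterministic.
        d = np.min([np.linalg.norm(patches - c, axis=1) for c in centers], axis=0)
        centers.append(patches[d.argmax()])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        labels = np.linalg.norm(patches[:, None] - centers[None], axis=2).argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = patches[labels == j].mean(0)
    return centers

def encode(patches, codebook):
    """Hard-assignment coding: L1-normalized histogram of nearest codewords."""
    labels = np.linalg.norm(patches[:, None] - codebook[None], axis=2).argmin(1)
    hist = np.bincount(labels, minlength=len(codebook)).astype(float)
    return hist / hist.sum()

def likelihood(template_code, candidate_code):
    """Histogram-intersection similarity used as the candidate's likelihood."""
    return np.minimum(template_code, candidate_code).sum()
```

In use, the template's coding vector is computed once from the first frame's patches, and each candidate window is encoded with the same codebook; the candidate whose code best matches the template's gets the highest likelihood.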