To address the poor classification performance and low recognition accuracy of traditional clothing-image recognition methods, this paper proposes a convolutional neural network method for clothing material classification and recognition based on an improved deep residual network (ResNet). By introducing an attention mechanism module into each residual block, a clothing material image classification and recognition model based on the improved ResNet is constructed. To support this research, a dataset specifically for clothing material classification and recognition was built. To verify the superiority of the improved network for clothing material recognition, traditional machine learning methods and classical deep learning networks are explored, their parameters are tuned to better suit clothing material recognition, and the improved model is compared against them. The experimental results show that the clothing material recognition network based on the improved ResNet performs best (99.2% recognition accuracy, 98.46% precision, 98.83% recall, and an F1 score of 0.9864) and can meet the needs of commercial networks for recognizing clothing materials.
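The abstract states that an attention module is added to each residual block but gives no implementation details. The following is a minimal, hypothetical sketch, assuming PyTorch (the paper's framework is not stated), of a squeeze-and-excitation style attention module inserted into a ResNet basic block; the class names and the reduction ratio of 16 are illustrative assumptions, not taken from the paper.

```python
# Sketch only: a ResNet basic block with channel attention on the residual branch.
import torch
import torch.nn as nn


class SEBlock(nn.Module):
    """Channel attention: squeeze (global average pool) then excite (two FC layers)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # reweight feature channels


class SEBasicBlock(nn.Module):
    """ResNet basic block with attention applied before the skip connection is added."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, stride, 1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_ch)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, 1, 1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        self.se = SEBlock(out_ch)
        self.relu = nn.ReLU(inplace=True)
        self.shortcut = (
            nn.Sequential(nn.Conv2d(in_ch, out_ch, 1, stride, bias=False),
                          nn.BatchNorm2d(out_ch))
            if stride != 1 or in_ch != out_ch else nn.Identity()
        )

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = self.se(out)  # attention on the residual branch
        return self.relu(out + self.shortcut(x))


# Usage example: one downsampling block on a dummy feature map.
block = SEBasicBlock(64, 128, stride=2)
print(block(torch.rand(1, 64, 56, 56)).shape)  # torch.Size([1, 128, 28, 28])
```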
Clothing pattern recognition on social media is a key application in Internet marketing, but it is currently performed manually, which is very inefficient. Our goal is to solve this problem with artificial intelligence. Building on an improved Mask RCNN network, this paper introduces the SENet attention mechanism so that the feature extractor focuses on the target regions of interest and assigns them greater weight, highlighting significant and useful features. This paper also contributes a new clothing dataset. Comparative experiments verify that the improved Mask RCNN network achieves significantly better clothing pattern recognition.
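This second abstract likewise mentions inserting SENet attention into the Mask RCNN feature extractor without implementation details. Below is a speculative sketch, assuming PyTorch/torchvision rather than the authors' code, of one plausible placement: recalibrating each FPN feature level of torchvision's Mask R-CNN with a channel-attention block. The wrapper class name and the level keys ("0"–"3", "pool") mirror torchvision's defaults and are assumptions for illustration only.

```python
# Sketch only: wrap a Mask R-CNN backbone so every FPN level is reweighted by channel attention.
from collections import OrderedDict
import torch
import torch.nn as nn
import torchvision


class ChannelAttention(nn.Module):
    """SENet-style squeeze-and-excitation over feature channels."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):
        w = self.fc(x).view(x.size(0), x.size(1), 1, 1)
        return x * w


class SEFPNWrapper(nn.Module):
    """Applies channel attention to each FPN feature map returned by the backbone."""
    def __init__(self, backbone):
        super().__init__()
        self.backbone = backbone
        self.out_channels = backbone.out_channels  # detection heads rely on this attribute
        levels = ["0", "1", "2", "3", "pool"]      # FPN output keys in torchvision's Mask R-CNN
        self.se = nn.ModuleDict({k: ChannelAttention(backbone.out_channels) for k in levels})

    def forward(self, x):
        feats = self.backbone(x)
        return OrderedDict(
            (k, self.se[k](v) if k in self.se else v) for k, v in feats.items()
        )


# Usage example on a randomly initialized model (no pretrained weights downloaded).
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights=None, weights_backbone=None)
model.backbone = SEFPNWrapper(model.backbone)
model.eval()
with torch.no_grad():
    detections = model([torch.rand(3, 512, 512)])  # list with one dict of boxes, labels, masks
```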