Traditionally, material surface classification has relied on reflectance spectrum measurement and pixel-wise comparison. For best results, this usually requires measuring the full spectral reflectance, which can be time-consuming and error-prone. Our previous work showed that convolutional neural networks can learn a feature hierarchy from Bi-directional Texture Function (BTF) data, from pixels to the classifier. In this paper, we train a wide-dense neural network on BTF data. The dense network builds on the residual network (ResNet), which fits a residual mapping for its stacked layers instead of directly fitting a desired underlying mapping; this shortcut idea performs well at greater depths. The dense structure reduces redundancy within the feature maps of the individual layers and increases training speed, while the narrow width of the dense layers keeps the feature maps compact across the whole network. We adopt a wide structure to accommodate the resolution of our BTF data. We generate BTF angular maps for material classification from data consisting of 151 lighting and 151 viewing directions, an angular resolution higher than the input resolution of conventional pre-trained dense network models. Adding the wide structure reduces training time and improves classification performance. Training our own network for specific data requires a large training dataset, so we augment the angular BTF images to improve the robustness of the trained model. With over 30,000 BTF angular maps, we obtain a more reliable model. Finally, we compare the improvement of our wide-dense network against other pre-trained neural networks and other feature extraction networks.
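The abstract contrasts ResNet's residual shortcut (fitting F(x) = H(x) - x rather than H(x) directly) with DenseNet-style concatenation, widened to suit high-resolution angular inputs. The following is a minimal PyTorch sketch of both ideas; the class names, growth rate, and widening factor are illustrative assumptions, not the paper's exact architecture.

```python
# Illustrative sketch only: hyperparameters and names are assumptions.
import torch
import torch.nn as nn


class WideDenseBlock(nn.Module):
    """DenseNet-style block: each layer's output is concatenated onto the
    running feature map, so later layers reuse earlier features with little
    redundancy. A widening factor scales the growth rate, trading depth for
    width (hypothetical choice for high-resolution 151x151 angular maps)."""

    def __init__(self, in_channels, growth_rate=12, num_layers=4, widen=2):
        super().__init__()
        k = growth_rate * widen  # widened growth rate (assumption)
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, k, kernel_size=3, padding=1, bias=False),
            ))
            channels += k
        self.out_channels = channels

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            # Dense connectivity: every layer sees all preceding feature maps.
            out = layer(torch.cat(features, dim=1))
            features.append(out)
        return torch.cat(features, dim=1)


class ResidualBlock(nn.Module):
    """ResNet-style block: the stacked layers fit a residual F(x), and the
    identity shortcut adds x back, i.e. y = F(x) + x."""

    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x):
        return torch.relu(self.body(x) + x)


# Example: a 151x151 angular map (matching the sampling described above).
y = WideDenseBlock(in_channels=3)(torch.randn(1, 3, 151, 151))
print(y.shape)  # torch.Size([1, 99, 151, 151])
```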
Traditional reflectance spectrum classification algorithms compare spectra across the electromagnetic spectrum, anywhere from the ultraviolet to the thermal infrared regions, analyzing reflectance on a pixel-by-pixel basis. Inspired by the high performance that convolutional neural networks (CNNs) have demonstrated in image classification, we apply a neural network to directional reflectance pattern images. Using bidirectional reflectance distribution function (BRDF) data, we reformulate the four-dimensional function as a two-dimensional image of incident direction × reflected direction, with spectral channels. Meanwhile, RIT's micro-DIRSIG model is used to simulate additional training samples, improving the robustness of network training. Unlike traditional classification, which pairs hand-designed feature extraction with a trainable classifier, a neural network stacks layers that learn a feature hierarchy from pixels to classifier, with all layers trained jointly. Hence, our approach of utilizing angular features differs from traditional methods that utilize spatial features. Although the training process typically carries a large computational cost, simple classifiers work well when applied to neural-network-generated features. Currently, the most popular neural networks, such as VGG, GoogLeNet, and AlexNet, are trained on RGB spatial image data. Our approach aims to build a neural network based on the directional reflectance spectrum, offering a complementary perspective. At the end of this paper, we compare several classifiers and analyze the trade-offs among neural network parameters.
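The 4-D-to-2-D reformulation amounts to flattening each (theta, phi) direction pair into a single index so the BRDF becomes an image a CNN can consume. A minimal NumPy sketch follows; the regular 15 × 10 angular grid and array names are hypothetical stand-ins for the actual sampling.

```python
# Minimal sketch, assuming a regular (theta, phi) grid; the real sampling
# scheme may differ. Random data stands in for measured reflectance.
import numpy as np

n_theta, n_phi, n_channels = 15, 10, 3  # 15 * 10 = 150 directions (assumed)

# brdf_4d[ti, pi, tr, pr, c]: reflectance for incident (ti, pi),
# reflected (tr, pr), spectral channel c.
brdf_4d = np.random.rand(n_theta, n_phi, n_theta, n_phi, n_channels)

# Flatten each (theta, phi) pair into one direction index, yielding a 2-D
# angular map: incident direction x reflected direction x channels.
n_dirs = n_theta * n_phi
angular_map = brdf_4d.reshape(n_dirs, n_dirs, n_channels)
print(angular_map.shape)  # (150, 150, 3) -- image-like input for a CNN
```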