Machine learning algorithms have traditionally relied on large datasets to achieve high accuracy. Recent advances, however, are enabling solutions in niche engineering areas where only small datasets are available. This article reviews the challenges of working with small datasets and the techniques that address them, with a focus on optoelectronics and biomedical engineering. In optoelectronics, small datasets are central to designing and validating photonic systems, since experiments involving living tissues are costly and complex. The article discusses how photonic response simulations and system calibration can be optimized using machine learning models that remain effective with limited data. In biomedical engineering, the focus is on 3D-printed tissue phantoms, which mimic the optical properties of living tissue and enable non-invasive validation of photonic devices in diagnostics. The study explores how small-data techniques such as transfer learning, bootstrapping, regularization, and K-fold cross-validation can improve the interpretation of small datasets, enhance predictive capability, and mitigate data scarcity.
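To make one of the listed small-data techniques concrete, the sketch below shows how bootstrapping (resampling with replacement) can quantify the uncertainty of an estimate drawn from only a handful of measurements. The sample values and the function name are illustrative assumptions, not taken from the article; the same idea applies to, e.g., a small set of calibration readings from a photonic sensor.

```python
import random
import statistics

def bootstrap_ci(data, n_resamples=2000, alpha=0.05, seed=0):
    """Estimate a (1 - alpha) confidence interval for the mean of a
    small sample by resampling it with replacement (the bootstrap)."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    # Mean of each resampled dataset, sorted to read off percentiles
    means = sorted(
        statistics.mean(rng.choices(data, k=len(data)))
        for _ in range(n_resamples)
    )
    lo = means[int(n_resamples * (alpha / 2))]
    hi = means[int(n_resamples * (1 - alpha / 2))]
    return lo, hi

# Hypothetical small sample (e.g. seven calibration measurements)
sample = [2.1, 2.4, 1.9, 2.6, 2.2, 2.0, 2.5]
low, high = bootstrap_ci(sample)
```

Even with only seven data points, the resulting interval `(low, high)` gives a defensible uncertainty estimate without assuming a particular noise distribution, which is one reason bootstrapping is attractive when collecting more data is expensive.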