Automatic breast ultrasound (ABUS) imaging is a well-established tool in breast cancer diagnosis. Delineating lesions on ABUS images is an essential step for breast cancer computer-aided diagnosis (CAD). This work aims to develop an automated deep learning-based method for breast tumor segmentation on three-dimensional ABUS. The proposed method, a one-stage hierarchical target activation network, consists of three subnetworks: a fully convolutional one-stage object detector (FCOS), a hierarchical block, and a mask module. A feature extractor is used to extract informative features from the ABUS images. The FCOS locates the volumes of interest (VOIs) of the breast tumor. The hierarchical block enhances the feature contrast around the tumor boundary. The mask module then segments the tumor from the refined feature map within the VOIs. A five-fold cross-validation on 40 patients' cases was conducted. The segmented ABUS breast tumors were compared with manual contours using several segmentation measures. The Dice similarity coefficient (DSC) and 95th-percentile Hausdorff distance (HD95) were 0.855±0.090 and 1.56±1.02 mm, respectively. These results demonstrate the feasibility and efficacy of the proposed method for breast tumor segmentation, which can further facilitate CAD of breast cancer using 3D ABUS imaging.
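For reference, the sketch below shows one common way to compute the two reported evaluation metrics (DSC and HD95) from binary 3D masks. It is a minimal illustration, not the authors' evaluation code; the function names (`dice_coefficient`, `hd95`) and the assumption of a known voxel spacing are introduced here for clarity.

```python
# Minimal sketch of DSC and 95th-percentile Hausdorff distance for 3D binary
# masks, assuming NumPy arrays and a voxel spacing given in mm. Illustrative
# only; not the evaluation code used in the paper.
import numpy as np
from scipy import ndimage


def dice_coefficient(pred, gt):
    """Dice similarity coefficient between two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    return 2.0 * intersection / (pred.sum() + gt.sum())


def _surface(mask):
    """Boundary voxels: the mask minus its binary erosion."""
    return np.logical_and(mask, np.logical_not(ndimage.binary_erosion(mask)))


def hd95(pred, gt, spacing=(1.0, 1.0, 1.0)):
    """Symmetric 95th-percentile Hausdorff distance (mm) between mask surfaces."""
    pred_s, gt_s = _surface(pred.astype(bool)), _surface(gt.astype(bool))
    # Distance from each voxel to the nearest surface voxel of the other mask.
    dist_to_gt = ndimage.distance_transform_edt(~gt_s, sampling=spacing)
    dist_to_pred = ndimage.distance_transform_edt(~pred_s, sampling=spacing)
    surface_distances = np.concatenate([dist_to_gt[pred_s], dist_to_pred[gt_s]])
    return np.percentile(surface_distances, 95)
```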