FARN: Fetal Anatomy Reasoning Network for Detection With Global Context Semantic and Local Topology Relationship

Cited by: 5
Authors
Zhao, Lei [1 ]
Tan, Guanghua [1 ]
Wu, Qianghui [1 ]
Pu, Bin [2 ]
Ren, Hongliang [3 ]
Li, Shengli [4 ]
Li, Kenli [1 ]
Affiliations
[1] Hunan Univ, Coll Comp Sci & Elect Engn, Changsha 410082, Peoples R China
[2] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, Hong Kong 999077, Peoples R China
[3] Chinese Univ Hong Kong, Dept Elect Engn, Hong Kong 999077, Peoples R China
[4] Shenzhen Maternal & Child Healthcare Hosp, Dept Ultrasound, Shenzhen 518028, Peoples R China
Funding
National Key Research and Development Program of China; National Natural Science Foundation of China;
Keywords
Visualization; Feature extraction; Anatomy; Anatomical structure; Topology; Semantics; Cognition; Adaptive graph convolution; context-guided detector; fetal ultrasound images; multi-anatomy detection;
DOI
10.1109/JBHI.2024.3392531
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Accurate recognition of fetal anatomical structures is a pivotal task in ultrasound (US) image analysis. Sonographers naturally apply anatomical knowledge and clinical expertise when recognizing key anatomical structures in complex US images. However, mainstream object detection approaches usually recognize each structure in isolation, overlooking the anatomical correlations between different structures in fetal US planes. In this work, we propose a Fetal Anatomy Reasoning Network (FARN) that incorporates two forms of relationship: a global context semantic block summarized from visual similarity and a local topology relationship block depicting structural pair constraints. Specifically, through the Adaptive Relation Graph Reasoning (ARGR) module, anatomical structures are treated as nodes, and the two kinds of relationships between nodes are modeled as edges. The flexibility of the model is enhanced by constructing the adaptive relationship graph in a data-driven way, enabling adaptation to diverse data samples without predefined additional constraints. The feature representation is further refined by aggregating the outputs of the ARGR module. Comprehensive experimental results demonstrate that FARN achieves promising performance in detecting 37 anatomical structures across key US planes in tertiary obstetric screening. FARN effectively exploits key relationships to improve detection performance, is robust to small-scale, similar, and indistinct structures, and avoids some detection errors that deviate from anatomical norms. Overall, our study serves as a resource for developing efficient and concise approaches to modeling inter-anatomy relationships.
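The graph-reasoning idea summarized in the abstract can be illustrated with a minimal sketch: per-structure ROI features act as graph nodes, the adjacency matrix is learned from pairwise feature affinity in a data-driven way rather than fixed by an anatomical prior, and one reasoning step refines the node features before detection heads consume them. This is not the authors' implementation; the module name, tensor shapes, affinity scoring, and single-layer design are assumptions for illustration (PyTorch).

```python
# Minimal sketch (hypothetical, not the FARN code) of adaptive relation graph
# reasoning over per-structure ROI features.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveRelationGraph(nn.Module):
    """Treats each candidate anatomical structure as a graph node and learns
    the edge weights (adjacency) from the data instead of a fixed prior."""

    def __init__(self, feat_dim: int, embed_dim: int = 256):
        super().__init__()
        # Projections used to score pairwise affinity between node features.
        self.query = nn.Linear(feat_dim, embed_dim)
        self.key = nn.Linear(feat_dim, embed_dim)
        # Graph convolution weight applied after message passing.
        self.gcn = nn.Linear(feat_dim, feat_dim)

    def forward(self, node_feats: torch.Tensor) -> torch.Tensor:
        # node_feats: (num_structures, feat_dim) ROI features from the detector.
        q, k = self.query(node_feats), self.key(node_feats)
        # Data-driven adjacency: softmax-normalized pairwise affinities.
        adjacency = F.softmax(q @ k.t() / q.shape[-1] ** 0.5, dim=-1)
        # One graph-reasoning step: aggregate neighbor features, then refine.
        messages = adjacency @ node_feats
        refined = F.relu(self.gcn(messages))
        # Residual connection keeps the original appearance features intact.
        return node_feats + refined


# Example: 37 candidate structures with 1024-dim ROI features (shapes assumed).
rois = torch.randn(37, 1024)
module = AdaptiveRelationGraph(feat_dim=1024)
out = module(rois)  # (37, 1024) relation-enhanced features
```

In FARN the global semantic and local topology relations are described as two separate edge types whose outputs are aggregated; the sketch above collapses this into a single learned adjacency purely to show the data-driven construction.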
Pages: 4866-4877
Page count: 12