
Author:

Zhai, H. | Yan, H.-Y. | Zhou, J.-Y. | Liu, J. | Xie, Q.-W. | Shen, L.-J. | Chen, X. | Han, H.

Indexed by:

Scopus SCIE

Abstract:

Advancements in animal behavior quantification methods have driven the development of computational ethology, enabling fully automated behavior analysis. Existing multi-animal pose estimation workflows rely on tracking-by-detection frameworks for either bottom-up or top-down approaches, requiring retraining to accommodate diverse animal appearances. This study introduces InteBOMB, an integrated workflow that enhances top-down approaches by incorporating generic object tracking, eliminating the need for prior knowledge of target animals while maintaining broad generalizability. InteBOMB includes two key strategies for tracking and segmentation in laboratory environments and two techniques for pose estimation in natural settings. The "background enhancement" strategy optimizes a foreground-background contrastive loss, generating more discriminative correlation maps. The "online proofreading" strategy stores human-in-the-loop long-term memory and dynamic short-term memory, enabling adaptive updates to object visual features. The "automated labeling suggestion" technique reuses the visual features saved during tracking to identify representative frames for training-set labeling. Additionally, the "joint behavior analysis" technique integrates these features with multimodal data, expanding the latent space for behavior classification and clustering.

To evaluate the framework, six datasets of mice and six datasets of non-human primates were compiled, covering laboratory and natural scenes. Benchmarking results demonstrated a 24% improvement in zero-shot generic tracking and a 21% enhancement in joint latent space performance across datasets, highlighting the effectiveness of this approach for robust, generalizable behavior analysis.
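The "background enhancement" strategy described above computes a foreground-background contrastive loss to make correlation maps more discriminative. A minimal sketch of one such loss, assuming L2-normalized features and a hinge-style margin (the function name, centroid pooling, and margin form are illustrative assumptions, not the paper's actual formulation):

```python
import numpy as np

def fg_bg_contrastive_loss(fg_feats, bg_feats, margin=1.0):
    """Illustrative foreground-background contrastive loss.

    fg_feats: (N, D) L2-normalized features from the tracked animal.
    bg_feats: (M, D) L2-normalized features from the background.
    Pulls foreground features toward their centroid and pushes
    background features' similarity to that centroid below a margin.
    """
    centroid = fg_feats.mean(axis=0)
    centroid /= np.linalg.norm(centroid)
    fg_sim = fg_feats @ centroid  # cosine similarity, want high
    bg_sim = bg_feats @ centroid  # cosine similarity, want low
    pull = np.mean(1.0 - fg_sim)  # attract foreground to its centroid
    push = np.mean(np.maximum(0.0, bg_sim - (1.0 - margin)))  # repel background
    return pull + push
```

A loss of this shape would be minimized when foreground features cluster tightly while background features fall outside the margin, which is the property that yields sharper correlation maps.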

Keyword:

Online learning; Joint latent space; Behavior analysis; Pose estimation; Generic object tracking; Background subtraction; Selective labeling

Author Community:

  • [ 1 ] [Zhai H.]Key Laboratory of Brain Cognition and Brain-inspired Intelligence Technology, Institute of Automation, Chinese Academy of Sciences, 100190, Beijing, China
  • [ 2 ] [Zhai H.]School of Future Technology, School of Artificial Intelligence, University of Chinese Academy of Sciences, 101408, Beijing, China
  • [ 3 ] [Yan H.-Y.]Key Laboratory of Brain Cognition and Brain-inspired Intelligence Technology, Institute of Automation, Chinese Academy of Sciences, 100190, Beijing, China
  • [ 4 ] [Yan H.-Y.]School of Future Technology, School of Artificial Intelligence, University of Chinese Academy of Sciences, 101408, Beijing, China
  • [ 5 ] [Zhou J.-Y.]School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, 200240, Shanghai, China
  • [ 6 ] [Liu J.]Key Laboratory of Brain Cognition and Brain-inspired Intelligence Technology, Institute of Automation, Chinese Academy of Sciences, 100190, Beijing, China
  • [ 7 ] [Xie Q.-W.]Research Base of Beijing Modern Manufacturing Development, Beijing University of Technology, 100124, Beijing, China
  • [ 8 ] [Shen L.-J.]Key Laboratory of Brain Cognition and Brain-inspired Intelligence Technology, Institute of Automation, Chinese Academy of Sciences, 100190, Beijing, China
  • [ 8 ] [Chen X.]Key Laboratory of Brain Cognition and Brain-inspired Intelligence Technology, Institute of Automation, Chinese Academy of Sciences, 100190, Beijing, China
  • [ 10 ] [Han H.]Key Laboratory of Brain Cognition and Brain-inspired Intelligence Technology, Institute of Automation, Chinese Academy of Sciences, 100190, Beijing, China
  • [ 11 ] [Han H.]School of Future Technology, School of Artificial Intelligence, University of Chinese Academy of Sciences, 101408, Beijing, China

Source:

Zoological Research

ISSN: 2095-8137

Year: 2025

Issue: 2

Volume: 46

Page: 355-369

