Zhen Cui · Jinshan Pan · Shanshan Zhang ·
Liang Xiao · Jian Yang (Eds.)

Intelligence Science and Big Data Engineering
Visual Data Engineering

9th International Conference, IScIDE 2019
Nanjing, China, October 17–20, 2019
Proceedings, Part I

Lecture Notes in Computer Science 11935

Founding Editors
Gerhard Goos
Karlsruhe Institute of Technology, Karlsruhe, Germany
Juris Hartmanis
Cornell University, Ithaca, NY, USA

Editorial Board Members


Elisa Bertino
Purdue University, West Lafayette, IN, USA
Wen Gao
Peking University, Beijing, China
Bernhard Steffen
TU Dortmund University, Dortmund, Germany
Gerhard Woeginger
RWTH Aachen, Aachen, Germany
Moti Yung
Columbia University, New York, NY, USA
More information about this series at http://www.springer.com/series/7412
Editors
Zhen Cui
Nanjing University of Science and Technology, Nanjing, China
Jinshan Pan
Nanjing University of Science and Technology, Nanjing, China
Shanshan Zhang
Nanjing University of Science and Technology, Nanjing, China
Liang Xiao
Nanjing University of Science and Technology, Nanjing, China
Jian Yang
Nanjing University of Science and Technology, Nanjing, China

ISSN 0302-9743 ISSN 1611-3349 (electronic)
Lecture Notes in Computer Science
ISBN 978-3-030-36188-4 ISBN 978-3-030-36189-1 (eBook)
https://doi.org/10.1007/978-3-030-36189-1
LNCS Sublibrary: SL6 – Image Processing, Computer Vision, Pattern Recognition, and Graphics

© Springer Nature Switzerland AG 2019


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the
material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation,
broadcasting, reproduction on microfilms or in any other physical way, and transmission or information
storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now
known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this book are
believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors
give a warranty, expressed or implied, with respect to the material contained herein or for any errors or
omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in
published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Preface

The International Conference on Intelligence Science and Big Data Engineering (IScIDE 2019) took place in Nanjing, China, during October 17–20, 2019. As one of the annual events organized by the Chinese Golden Triangle ISIS (Information Science and Intelligence Science) Forum, this meeting was scheduled as the 9th in a series of annual meetings promoting the academic exchange of research on various areas of intelligence science and big data engineering in China and abroad.

We received a total of 225 submissions, each of which was reviewed by at least 3 reviewers. Finally, 84 papers were accepted for presentation at the conference, an acceptance rate of 37.33%. Among the accepted papers, 14 were selected for oral presentations, 35 for spotlight presentations, and 35 for poster presentations. We would like to thank all the reviewers for spending their precious time reviewing the papers and for providing valuable comments that helped significantly in the paper selection process. The proceedings also include an invited paper entitled “Deep IA-BI and Five Actions in Circling” by Prof. Lei Xu.
We are grateful to the conference general co-chairs, Lei Xu, Xinbo Gao, and Jian Yang, for their leadership, advice, and help on crucial matters concerning the conference. We would like to thank all members of the Steering Committee, Program Committee, Organizing Committee, and Publication Committee for their hard work. We give special thanks to Prof. Xu Zhang, Prof. Steve S. Chen, Prof. Lei Xu, Prof. Ming-Hsuan Yang, Prof. Masashi Sugiyama, Prof. Jingyi Yu, Prof. Dong Xu, and Prof. Kun Zhang for delivering the keynote speeches. We would also like to thank Prof. Lei Xu for contributing a high-quality invited paper. Finally, we greatly appreciate all the authors’ contributions to the high quality of this conference. We count on your continued support of the ISIS community in the future.

October 2019

Zhen Cui
Jinshan Pan
Shanshan Zhang
Liang Xiao
Jian Yang
Organization

General Chairs
Lei Xu Shanghai Jiao Tong University, China
Xinbo Gao Xidian University, China
Jian Yang Nanjing University of Science and Technology, China

Program Chairs
Huafu Chen University of Electronic Science and Technology of China, China
Zhouchen Lin Peking University, China
Kun Zhang Carnegie Mellon University, USA
Zhen Cui Nanjing University of Science and Technology, China

Organization Chairs
Liang Xiao Nanjing University of Science and Technology, China
Chen Gong Nanjing University of Science and Technology, China

Special Issue Chairs
Mingming Cheng Nankai University, China
Jinshan Pan Nanjing University of Science and Technology, China

Publication Chairs
Shanshan Zhang Nanjing University of Science and Technology, China
Wankou Yang Southeast University, China

Program Committee
Mingming Gong The University of Melbourne, Australia
Joseph Ramsey Carnegie Mellon University, USA
Biwei Huang Carnegie Mellon University, USA
Daniel Malinsky Johns Hopkins University, USA
Ruben Sanchez-Romero Rutgers University, USA
Shohei Shimizu RIKEN, Japan
Ruichu Cai Guangdong University of Technology, China
Shuigeng Zhou Fudan University, China
Changdong Wang Sun Yat-sen University, China
Tao Lei Shaanxi University of Science & Technology, China
Xianye Ben Shandong University, China
Jorma Rissanen Emeritus, Tampere University of Technology, Finland
Alan L. Yuille Johns Hopkins University, USA
Andrey S. Krylov Lomonosov Moscow State University, Russia
Jinbo Xu Toyota Technological Institute at Chicago,
University of Chicago, USA
Nathan Srebro Toyota Technological Institute at Chicago,
University of Chicago, USA
Raquel Urtasun Uber ATG Toronto, Canada
Hava T. Siegelmann University of Massachusetts Amherst, USA
Jurgen Schmidhuber European Academy of Sciences and Arts, Austria
Sayan Mukherjee Duke University, USA
Vincent Tseng National Cheng Kung University, Taiwan
Alessandro Giua University of Cagliari, Italy
Shu-Heng Chen National Chengchi University, Taiwan
Seungjin Choi Pohang University of Science and Technology, South Korea
Kenji Fukumizu The Institute of Statistical Mathematics, Japan
Kalviainen Heikki Lappeenranta University of Technology, Finland
Akira Hirose The University of Tokyo, Japan
Tu Bao Ho JAIST, Japan
Derek Hoiem University of Illinois at Urbana-Champaign, USA
Ikeda Kazushi Nara Institute of Science and Technology, Japan
Seiichi Ozawa Kobe University, Japan
Yishi Wang UNC Wilmington, USA
Cuixian Chen UNC Wilmington, USA
Karl Ricanek UNC Wilmington, USA
Hichem Sahli Vrije Universiteit Brussel, Belgium
Simone Fiori Università Politecnica delle Marche, Italy
Stephen Cox The Australian National University, Australia
Vincent Tseng National Cheng Kung University, Taiwan
Qiang Yang Hong Kong University of Science and Technology,
Hong Kong, China
Chengjun Liu New Jersey Institute of Technology, USA
Shuicheng Yan National University of Singapore, Singapore
Jieping Ye University of Michigan, USA
Wai-Kiang Yeap Auckland University of Technology, New Zealand
Hujun Yin The University of Manchester, UK
Lei Zhang Hong Kong Polytechnic University, Hong Kong, China
Qinfeng Shi University of Adelaide, Australia
Wanli Ouyang The University of Sydney, Australia
Yida Xu University of Technology, Sydney, Australia
Hongyan Wang Dalian University of Technology, China
Yazhou Yao University of Technology, Sydney, Australia
Xiaoning Song Jiangnan University, China
Yong Xia Northwestern Polytechnical University, China

Lei Zhang Chongqing University, China


Tao Wang Nanjing University of Science and Technology, China
Changxing Ding South China University of Technology, China
Xin Liu Huaqiao University, China
Yang Liu Dalian University of Technology, China
Ying Tai YouTu Lab, Tencent, China
Minqiang Yang Lanzhou University, China
Guangwei Gao Nanjing University of Posts and Telecommunications,
China
Shuzhe Wu Chinese Academy of Sciences, China
Youyong Kong Southeast University, China
Qiguang Miao Xidian University, China
Chang Xu The University of Sydney, Australia
Tengfei Song Southeast University, China
Xingpeng Jiang Central China Normal University, China
Wei-Shi Zheng Sun Yat-sen University, China
Yu Chen Motovis Inc., Australia
Zebin Wu Nanjing University of Science and Technology, China
Wei Luo South China Agricultural University, China
Minxian Li Nanjing University of Science and Technology, China
Ruiping Wang Chinese Academy of Sciences, China
Jia Liu Xidian University, China
Yang He Max Planck Institute for Informatics, Germany
Xiaobo Chen Jiangsu University, China
Xiangbo Shu Nanjing University of Science and Technology, China
Yun Gu Shanghai Jiao Tong University, China
Xin Geng Southeast University, China
Zheng Wang National Institute of Informatics, Japan
Lefei Zhang Wuhan University, China
Liping Xie Southeast University, China
Xiangyuan Lan Hong Kong Baptist University, Hong Kong, China
Xi Peng Agency for Science, Technology and Research
(A*STAR), Singapore
Yuxin Peng Peking University, China
Cheng Deng Xidian University, China
Dong Gong The University of Adelaide, Australia
Meina Kan Chinese Academy of Sciences, China
Hualong Yu Jiangsu University of Science and Technology, China
Kazushi Ikeda Nara Institute of Science and Technology, Japan
Meng Yang Sun Yat-Sen University, China
Ping Du Shandong Normal University, China
Jufeng Yang Nankai University, China
Andrey Krylov Lomonosov Moscow State University, Russia
Shun Zhang Northwestern Polytechnical University, China
Di Huang Beihang University, China
Shuaiqi Liu Tianjin Normal University, China
Chun-Guang Li Beijing University of Posts and Telecommunications, China
Huimin Ma Tsinghua University, China
Longyu Jiang Southeast University, China
Shikui Tu Shanghai Jiao Tong University, China
Lijun Wang Dalian University of Technology, China
Xiao-Yuan Jing Wuhan University, China
Shiliang Sun East China Normal University, China
Zhenzhen Hu Hefei University of Technology, China
Ningzhong Liu Nanjing University of Aeronautics and Astronautics, China
Hiroyuki Iida JAIST, Japan
Jinxia Zhang Southeast University, China
Ying Fu Beijing Institute of Technology, China
Tongliang Liu The University of Sydney, Australia
Weihong Deng Beijing University of Posts and Telecommunications,
China
Wen Zhang Wuhan University, China
Dong Wang Dalian University of Technology, China
Hang Dong Xi’an Jiaotong University, China
Dongwei Ren Tianjin University, China
Xiaohe Wu Harbin Institute of Technology, China
Qianru Sun National University of Singapore, Singapore
Yunchao Wei University of Illinois at Urbana-Champaign, USA
Wenqi Ren Chinese Academy of Sciences, China
Wenda Zhao Dalian University of Technology, China
Jiwen Lu Tsinghua University, China
Yukai Shi Sun Yat-sen University, China
Enmei Tu Shanghai Jiao Tong University, China
Yufeng Li Nanjing University, China
Qilong Wang Tianjin University, China
Baoyao Yang Hong Kong Baptist University, Hong Kong, China
Qiuhong Ke Max Planck Institute for Informatics, Germany
Guanyu Yang Southeast University, China
Jiale Cao Tianjin University, China
Zhuo Su Sun Yat-sen University, China
Zhao Zhang Hefei University of Technology, China
Hong Pan Southeast University, China
Hu Han Chinese Academy of Sciences, China
Hanjiang Lai Sun Yat-Sen University, China
Xin Li Harbin Institute of Technology, Shenzhen, China
Dingwen Zhang Northwestern Polytechnical University, China
Guo-Sen Xie Inception Institute of Artificial Intelligence, UAE
Xibei Yang Jiangsu University of Science and Technology, China
Haixian Wang Southeast University, China
Wangmeng Zuo Harbin Institute of Technology, China
Weiwei Liu University of Technology, Sydney, Australia

Shuhang Gu ETH Zurich, Switzerland


Hanli Wang Tongji University, China
Zequn Jie Tencent, China
Xiaobin Zhu University of Science and Technology Beijing, China
Gou Jin Huaqiao University, China
Junchi Yan Shanghai Jiao Tong University, China
Bineng Zhong Huaqiao University, China
Nannan Wang Xidian University, China
Bo Han RIKEN, Japan
Xiaopeng Hong Xi’an Jiaotong University, China
Yuchao Dai Northwestern Polytechnical University, China
Wenming Zheng Southeast University, China
Lixin Duan University of Science and Technology of China, China
Hu Zhu Nanjing University of Posts and Telecommunications,
China
Xiaojun Chang Carnegie Mellon University, USA
Contents – Part I

Deep IA-BI and Five Actions in Circling . . . 1
Lei Xu

Adaptive Online Learning for Video Object Segmentation . . . 22
Li Wei, Chunyan Xu, and Tong Zhang

Proposal-Aware Visual Saliency Detection with Semantic Attention . . . 35
Lu Wang, Tian Song, Takafumi Katayama, and Takashi Shimamoto

Constrainted Subspace Low-Rank Representation with Spatial-Spectral Total Variation for Hyperspectral Image Restoration . . . 46
Jun Ye and Xian Zhang

Memory Network-Based Quality Normalization of Magnetic Resonance Images for Brain Segmentation . . . 58
Yang Su, Jie Wei, Benteng Ma, Yong Xia, and Yanning Zhang

Egomotion Estimation Under Planar Motion with an RGB-D Camera . . . 68
Xuelan Mu, Zhixin Hou, and Yigong Zhang

Sparse-Temporal Segment Network for Action Recognition . . . 80
Chaobo Li, Yupeng Ding, and Hongjun Li

SliceNet: Mask Guided Efficient Feature Augmentation for Attention-Aware Person Re-Identification . . . 91
Zhipu Liu and Lei Zhang

Smoother Soft-NMS for Overlapping Object Detection in X-Ray Images . . . 103
Chunhui Lin, Xudong Bao, and Xuan Zhou

Structure-Preserving Guided Image Filtering . . . 114
Hongyan Wang, Zhixun Su, and Songxin Liang

Deep Blind Image Inpainting . . . 128
Yang Liu, Jinshan Pan, and Zhixun Su

Robust Object Tracking Based on Multi-granularity Sparse Representation . . . 142
Honglin Chu, Jiajun Wen, and Zhihui Lai

A Bypass-Based U-Net for Medical Image Segmentation . . . 155
Kaixuan Chen, Gengxin Xu, Jiaying Qian, and Chuan-Xian Ren

Real-Time Visual Object Tracking Based on Reinforcement Learning with Twin Delayed Deep Deterministic Algorithm . . . 165
Shengjie Zheng and Huan Wang

Efficiently Handling Scale Variation for Pedestrian Detection . . . 178
Qihua Cheng and Shanshan Zhang

Leukocyte Segmentation via End-to-End Learning of Deep Convolutional Neural Networks . . . 191
Yan Lu, Haoyi Fan, and Zuoyong Li

Coupled Squeeze-and-Excitation Blocks Based CNN for Image Compression . . . 201
Jing Du, Yang Xu, and Zhihui Wei

Soft Transferring and Progressive Learning for Human Action Recognition . . . 213
Shenqiang Yuan, Xue Mei, Yi He, and Jin Zhang

Face Sketch Synthesis Based on Adaptive Similarity Regularization . . . 226
Songze Tang and Mingyue Qiu

Three-Dimensional Coronary Artery Centerline Extraction and Cross Sectional Lumen Quantification from CT Angiography Images . . . 238
Hengfei Cui, Yong Xia, and Yanning Zhang

A Robust Facial Landmark Detector with Mixed Loss . . . 249
Xian Zhang, Xinjie Tong, Ziyu Li, and Wankou Yang

Object Guided Beam Steering Algorithm for Optical Phased Array (OPA) LIDAR . . . 262
Zhiqing Wang, Zhiyu Xiang, and Eryun Liu

Channel Max Pooling for Image Classification . . . 273
Lu Cheng, Dongliang Chang, Jiyang Xie, Rongliang Ma, Chunsheng Wu, and Zhanyu Ma

A Multi-resolution Coarse-to-Fine Segmentation Framework with Active Learning in 3D Brain MRI . . . 285
Zhenxi Zhang, Jie Li, Zhusi Zhong, Zhicheng Jiao, and Xinbo Gao

Deep 3D Facial Landmark Detection on Position Maps . . . 299
Kangkang Gao, Shanming Yang, Keren Fu, and Peng Cheng

Joint Object Detection and Depth Estimation in Multiplexed Image . . . 312
Changxin Zhou and Yazhou Liu

Weakly-Supervised Semantic Segmentation with Mean Teacher Learning . . . 324
Li Tan, WenFeng Luo, and Meng Yang

APAC-Net: Unsupervised Learning of Depth and Ego-Motion from Monocular Video . . . 336
Rui Lin, Yao Lu, and Guangming Lu

Robust Image Recovery via Mask Matrix . . . 349
Mengying Jin and Yunjie Chen

Multiple Objects Tracking Based Vehicle Speed Analysis with Gaussian Filter from Drone Video . . . 362
Yue Liu, Zhichao Lian, Junjie Ding, and Tangyi Guo

A Novel Small Vehicle Detection Method Based on UAV Using Scale Adaptive Gradient Adjustment . . . 374
Changju Feng and Zhichao Lian

A Level Set Method for Natural Image Segmentation by Texture and High Order Edge-Detector . . . 386
Yutao Yao, Ziguan Cui, and Feng Liu

An Attention Bi-box Regression Network for Traffic Light Detection . . . 399
Juncai Ma, Yao Zhao, Ming Luo, Xiang Jiang, Ting Liu, and Shikui Wei

MGD: Mask Guided De-occlusion Framework for Occluded Person Re-identification . . . 411
Peixi Zhang, Jianhuang Lai, Quan Zhang, and Xiaohua Xie

Multi-scale Residual Dense Block for Video Super-Resolution . . . 424
Hetao Cui and Quansen Sun

Visual Saliency Guided Deep Fabric Defect Classification . . . 435
Yonggui He, Yaoye Song, Jifeng Shen, and Wankou Yang

Locality and Sparsity Preserving Embedding Convolutional Neural Network for Image Classification . . . 447
Yu Xia and Yongzhao Zhan

Person Re-identification Using Group Constraint . . . 459
Ling Mei, Jianhuang Lai, Zhanxiang Feng, Zeyu Chen, and Xiaohua Xie

A Hierarchical Student’s t-Distributions Based Unsupervised SAR Image Segmentation Method . . . 472
Yuhui Zheng, Yahui Sun, Le Sun, Hui Zhang, and Byeungwoo Jeon

Multi-branch Semantic GAN for Infrared Image Generation from Optical Image . . . 484
Lei Li, Pengfei Li, Meng Yang, and Shibo Gao

Semantic Segmentation for Prohibited Items in Baggage Inspection . . . 495
Jiuyuan An, Haigang Zhang, Yue Zhu, and Jinfeng Yang

Sparse Unmixing for Hyperspectral Image with Nonlocal Low-Rank Prior . . . 506
Feiyang Wu, Yuhui Zheng, and Le Sun

Saliency Optimization Integrated Robust Background Detection with Global Ranking . . . 517
Zipeng Zhang, Yixiao Liang, Jian Zheng, Kai Li, Zhuanlian Ding, and Dengdi Sun

Improvement of Residual Attention Network for Image Classification . . . 529
Lu Liang, Jiangdong Cao, Xiaoyan Li, and Jane You

Nuclei Perception Network for Pathology Image Analysis . . . 540
Haojun Xu, Yan Gao, Liucheng Hu, Jie Li, and Xinbo Gao

A k-Dense-UNet for Biomedical Image Segmentation . . . 552
Zhiwen Qiang, Shikui Tu, and Lei Xu

Gated Fusion of Discriminant Features for Caricature Recognition . . . 563
Lingna Dai, Fei Gao, Rongsheng Li, Jiachen Yu, Xiaoyuan Shen, Huilin Xiong, and Weilun Wu

Author Index . . . 575


Contents – Part II

Analysis of WLAN’s Receiving Signal Strength Indication for Indoor Positioning . . . 1
Minmin Lin, Zhisen Wei, Baoxing Chen, Wenjie Zhang, and Jingmin Yang

Computational Decomposition of Style for Controllable and Enhanced Style Transfer . . . 15
Minchao Li, Shikui Tu, and Lei Xu

Laplacian Welsch Regularization for Robust Semi-supervised Dictionary Learning . . . 40
Jingchen Ke, Chen Gong, and Lin Zhao

Non-local MMDenseNet with Cross-Band Features for Audio Source Separation . . . 53
Yi Huang

A New Method of Metaphor Recognition for A-is-B Model in Chinese Sentences . . . 65
Wei-min Wang, Rong-rong Gu, Shou-fu Fu, and Dong-sheng Wang

Layerwise Recurrent Autoencoder for Real-World Traffic Flow Forecasting . . . 78
Junhui Zhao, Tianqi Zhu, Ruidong Zhao, and Peize Zhao

Mining Meta-association Rules for Different Types of Traffic Accidents . . . 89
Ziyu Zhao, Weili Zeng, Zhengfeng Xu, and Zhao Yang

Reliable Domain Adaptation with Classifiers Competition . . . 101
Jingru Fu and Lei Zhang

An End-to-End LSTM-MDN Network for Projectile Trajectory Prediction . . . 114
Li-he Hou and Hua-jun Liu

DeepTF: Accurate Prediction of Transcription Factor Binding Sites by Combining Multi-scale Convolution and Long Short-Term Memory Neural Network . . . 126
Xiao-Rong Bao, Yi-Heng Zhu, and Dong-Jun Yu

Epileptic Seizure Prediction Based on Convolutional Recurrent Neural Network with Multi-Timescale . . . 139
Lijuan Duan, Jinze Hou, Yuanhua Qiao, and Jun Miao

L2R-QA: An Open-Domain Question Answering Framework . . . 151
Tieke He, Yu Li, Zhipeng Zou, and Qing Wu

Attention Relational Network for Few-Shot Learning . . . 163
Jia Shuai, JiaMing Chen, and Meng Yang

Syntactic Analysis of Power Grid Emergency Pre-plans Based on Transfer Learning . . . 175
He Shi, Qun Yang, Bo Wang, Shaohan Liu, and Kai Zhou

Improved CTC-Attention Based End-to-End Speech Recognition on Air Traffic Control . . . 187
Kai Zhou, Qun Yang, XiuSong Sun, ShaoHan Liu, and JinJun Lu

Revisit Lmser from a Deep Learning Perspective . . . 197
Wenjin Huang, Shikui Tu, and Lei Xu

A New Network Traffic Identification Base on Deep Factorization Machine . . . 209
Zhenxing Xu, Junyi Zhang, Daoqiang Zhang, and Hanyu Wei

3Q: A 3-Layer Semantic Analysis Model for Question Suite Reduction . . . 219
Wei Dai, Siyuan Shen, and Tieke He

Data Augmentation for Deep Learning of Judgment Documents . . . 232
Ge Yan, Yu Li, Shu Zhang, and Zhenyu Chen

An Advanced Least Squares Twin Multi-class Classification Support Vector Machine for Few-Shot Classification . . . 243
Yu Li, Zhonggeng Liu, Huadong Pan, Jun Yin, and Xingming Zhang

LLN-SLAM: A Lightweight Learning Network Semantic SLAM . . . 253
Xichao Qu and Weiqing Li

Meta-cluster Based Consensus Clustering with Local Weighting and Random Walking . . . 266
Nannan He and Dong Huang

Robust Nonnegative Matrix Factorization Based on Cosine Similarity Induced Metric . . . 278
Wen-Sheng Chen, Haitao Chen, Binbin Pan, and Bo Chen

Intellectual Property in Colombian Museums: An Application of Machine Learning . . . 289
Jenny Paola Lis-Gutiérrez, Álvaro Zerda Sarmiento, and Amelec Viloria

Hybrid Matrix Factorization for Multi-view Clustering . . . 302
Hongbin Yu and Xin Shu

Car Sales Prediction Using Gated Recurrent Units Neural Networks with Reinforcement Learning . . . 312
Bowen Zhu, Huailong Dong, and Jing Zhang

A Multilayer Sparse Representation of Dynamic Brain Functional Network Based on Hypergraph Theory for ADHD Classification . . . 325
Yuduo Zhang, Zhichao Lian, and Chanying Huang

Stress Wave Tomography of Wood Internal Defects Based on Deep Learning and Contour Constraint Under Sparse Sampling . . . 335
Xiaochen Du, Jiajie Li, Hailin Feng, and Heng Hu

Robustness of Network Controllability Against Cascading Failure . . . 347
Lv-lin Hou, Yan-dong Xiao, and Liang Lu

Multi-modality Low-Rank Learning Fused First-Order and Second-Order Information for Computer-Aided Diagnosis of Schizophrenia . . . 356
Huijie Li, Qi Zhu, Rui Zhang, and Daoqiang Zhang

A Joint Bitrate and Buffer Control Scheme for Low-Latency Live Streaming . . . 369
Si Chen, Yuan Zhang, Huan Peng, and Jinyao Yan

Causal Discovery of Linear Non-Gaussian Acyclic Model with Small Samples . . . 381
Feng Xie, Ruichu Cai, Yan Zeng, and Zhifeng Hao

Accelerate Black-Box Attack with White-Box Prior Knowledge . . . 394
Jinghui Cai, Boyang Wang, Xiangfeng Wang, and Bo Jin

A Dynamic Model + BFR Algorithm for Streaming Data Sorting . . . 406
Yongwei Tan, Ling Huang, and Chang-Dong Wang

Smartphone Behavior Based Electronical Scale Validity Assessment Framework . . . 418
Minqiang Yang, Jingsheng Tang, Longzhe Tang, and Bin Hu

Discrimination Model of QAR High-Severity Events Using Machine Learning . . . 430
Junchen Li, Haigang Zhang, and Jinfeng Yang

A New Method of Improving BERT for Text Classification . . . 442
Shaomin Zheng and Meng Yang

Author Index . . . 453


Deep IA-BI and Five Actions in Circling

Lei Xu¹,²(B)

¹ Centre for Cognitive Machines and Computational Health (CMaCH), SEIEE,
Shanghai Jiao Tong University, Minhang, Shanghai, China
[email protected]
² Neural Computation Research Centre, Brain and Intelligence Sci-Tech Institute,
ZhangJiang National Lab, Shanghai, China
http://www.cs.sjtu.edu.cn/~lxu/

Abstract. Deep bidirectional Intelligence (BI) via YIng YAng (IA) sys-
tem, or shortly Deep IA-BI, is featured by circling A-mapping and I-
mapping (or shortly AI circling) that sequentially performs each of five
actions. A basic foundation of IA-BI is bidirectional learning that makes
the cascading of A-mapping and I-mapping (shortly A-I cascading)
approximate an identical mapping, with a nature of layered, topology-
preserved, and modularised development. One exemplar is Lmser that
improves autoencoder by incremental bidirectional layered development
of cognition, featured by two dual natures DPN and DCW. Two typical
IA-BI scenarios are further addressed. One considers bidirectional cogni-
tion and image thinking, together with a proposal that combines the theories
of Hubel-Wiesel versus Chen. The other considers bidirectional inte-
gration of cognition, knowledge accumulation, and abstract thinking for
improving implementation of searching, optimising, and reasoning. Par-
ticularly, an IA-DSM scheme is proposed for solving a doubly stochastic
matrix (DSM) featured combinatorial tasks such as travelling salesman
problem, and also a Subtree driven reasoning scheme is proposed for
improving production rule based reasoning. In addition, some remarks
are made on relations of Deep IA-BI to Hubel and Wiesel theory, Sperry
theory, and A5 problem solving paradigm.

Keywords: Bidirectional · Cognition · Image thinking · Abstract


thinking · Inferring · Reasoning · Topology · Optimising · Production
rule

1 Deep Bidirectional Intelligence

Bidirectional intelligence (BI) was recently overviewed in Ref. [57]. As illustrated


in the centre of Fig. 1(b), the bi-direction is featured by a circling similar to the

L. Xu—Supported by the Zhi-Yuan Chair Professorship Start-up Grant WF220103010


from Shanghai Jiao Tong University, and National New Generation Artificial Intelli-
gence Project 2018AAA0100700.
© Springer Nature Switzerland AG 2019
Z. Cui et al. (Eds.): IScIDE 2019, LNCS 11935, pp. 1–21, 2019.
https://doi.org/10.1007/978-3-030-36189-1_1

one of the ancient Chinese yIng yAng logo¹. First, real bodies or patterns in the
Actual world or shortly A-domain (the domain that is visible or named yAng)
are mapped along the inward direction into the inner coding domain or shortly
I-domain (the domain that is invisible or named yIng). This mapping transfers
a real body to an information coding seed, like a yAng or male animal (named A-
mapping), performing Abstraction tasks such as perception, recognition, and
cognition. Second, codes, concepts or symbols in the inner I-domain are mapped
along the outward direction named I-mapping (a mapping from coding seed to
real body in the A-domain, i.e., acting like a yIng or female animal) to perform Inference
tasks that may be categorised into image thinking and abstract thinking.
The A-mapping and then I-mapping circling (or shortly A-I circling) performs
each of five actions sequentially. The first action is acquiring data X and features
that describe X. As to be further addressed in Sect. 2, the process is featured
by either deep neural networks (NN) or convolutional NN (CNN) that proceeds
layer by layer to perform hierarchical extraction and abstraction. The 1981 Nobel
prize winners Hubel and Wiesel [24,25] developed a theory that explains how this
proceeds in a manner of hierarchy as illustrated in Fig. 1(a). This H-W theory
has greatly impacted the subsequent efforts in the studies of artificial intelligence
and neural networks, including the recent more-than-decade-long bloom of deep
learning driven AI studies.
The second action performs abstraction by an inner code Y that indicates one
among labels or concepts, allocates chances among candidates, gets a dimension-
ally much reduced code as an inner representation of X, and even forms an icon
or a subtree structure. This action not only performs perception and recognition,
but also provides cognitions and evidence to the third action Inner-memory that
accumulates knowledge and evidence.

Fig. 1. Deep IA-BI (i.e., bidirectional Intelligence via YIng YAng system)

¹ “Ying” is spelled “Yin” in the current Chinese Pin Yin system, which can be traced
back over 400 years to the initiatives of M. Ricci and N. Trigault. But the length of
‘Yin’ lost its harmony with ‘Yang’, thus ‘Ying’ has been preferred since 1995 [42].
The knowledge and evidence come from two sources. One acquires, memorises,
and organises various knowledge via education, e.g., in the formulation of a
knowledge graph. The other not only gathers evidence for this organised and
structured knowledge, but also adds in concepts and cognitions from the second
action.
The fourth action inference may be jointly activated by the status of its
previous two actions and possibly some short-cut signals as well, which performs
either or both of the following two manners:
Image Thinking. It is also closely related to what is called concrete thinking
elsewhere, and usually refers to a thought process based on dependencies
among either or both of real/concrete things and their mappings/images in the
I-domain of the brain. The key point is specific, concrete, and detailed. Cognition
via A-mapping trained by supervised deep learning is just one scenario. Another
scenario happens when there is no teaching label. Whether the perceived Y by
the A-mapping X → Y makes sense is verified by checking whether X̂ generated
by the I-mapping Y → X̂ approximates X as closely as possible. Based on the
cognition by X → Y , the I-mapping Y → X̂ performs various tasks that map
X of input patterns into Z of simplified patterns, enriched patterns, and trans-
formed patterns, as well as imaginary and creative patterns. Typical examples
include language to language, text to image, text to sketch, sketch to image,
image to image, 2D image to 3D image, past to future, image to caption, image
to sentence, music to dance, ..., etc. All these image thinking tasks are performed
by an A-I cascading and featured by an information flow that varies layer by
layer in a topological preservation manner, as if displayed by an image sequence.
For other details, see Sect. V.A in Ref. [57].
Abstract Thinking. It is also closely related to what is called rational thinking,
and usually refers to a thought process based on either or both of causal rela-
tions among events and logical relations among concepts in a broad, general and
non-specific sense, typically described by symbolic or graphical representations.
Typical examples are searching, selecting, optimising, reasoning, and planning,
which are performed in a discrete space of individual or combinatorial choices.
Traditionally, abstract thinking is performed by I-mapping that searches a
discrete space according to the knowledge and evidence accumulated and organised
in the third action, which usually encounters intractable computing difficulties.
Exemplified by AlphaGo [38] and AlphaGoZero [39], searching performances can
be significantly improved with help of one appropriate A-mapping via deep neu-
ral networks that provides either or both of probabilistic selecting polices and
heuristic values.
Following either or both of image thinking and abstract thinking, the fifth
action is implementation of communication (verbal, writing, gesturing, postur-
ing, etc.) and control (motoring, monitoring, steering, etc.) as desired.

As to be further addressed in the last section, the above A5 featured AI


circling is actually a further development of A5 problem solving [51], which was
motivated by analysing the key ingredients of the randomised Hough transform
and multi-learner based problem solving [49,50]. Also, one early exemplar
of IA-BI system is Bayesian Ying Yang (BYY) system [42,51]. Moreover, the
principle that the A-I cascading approximates one identical mapping is just
one of the special cases addressed by Bayesian Ying Yang learning theory
[42,47,48,51–54]. Recently, it has been further extended to cover abstract thinking in
a general thesis named BYY intelligence potential theory (BYY-IPT) [57].
In subsequent sections, further insights on IA-BI and A5 are addressed.
The next section traces the advances on the A-I cascading from the late
eighties and early nineties to some recent studies, with insights on the lay-
ered, topology-preserved, and modular development of bidirectional learning.
Section 3 further addresses bidirectional cognition and image thinking from a per-
spective of Hubel-Wiesel versus Chen theories, with one combined scheme sug-
gested. Section 4 further considers bidirectional integration of cognition, knowl-
edge accumulation, and abstract thinking, with insights and suggestions on
improving implementation of searching, optimising, and reasoning. Particularly,
an IA-DSM scheme is proposed for solving those doubly stochastic matrix (DSM)
featured combinatorial tasks such as travelling salesman problem. Also, a Sub-
tree driven reasoning scheme is proposed for improving production rule based
reasoning. In the last section, after a summary, remarks are made on relations
of Deep IA-BI to the split-brain theory by the 1981 Nobel prize winner R.W.
Sperry and A5 problem solving paradigm proposed more than two decades ago.

2 Layered, Topology-Preserved, and Modularly


Developing
As mentioned above, the basic foundation of bidirectional intelligence is obtained
from a bidirectional learning that makes the A-I cascading approximate one
identical mapping. Early efforts along such a line can be traced back to the
late eighties and early nineties of the last century [1,3,7,12], under the
name Autoencoder (shortly AE, or called auto-association). As illustrated in
Fig. 2(a), AE encodes X into a vector Y in a lowered dimension by a multilayer
net called encoder and decodes Y back to X̂ by a multilayer net called decoder.
The decoder shares the same structure of the encoder in a mirror architecture
and acts as an inverse of the encoder.
The encoder and the decoder jointly make X̂ approximate X as closely as
possible, which may be regarded as a principle of primitive cognition (PC): it
justifies that the encoding X → Y is a meaningful abstraction by requiring Y → X̂
by a mirror architecture to perform an inverse process, such that X can be
perceived or understood via Y .
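The AE principle just described (encode X into a lower-dimensional Y , decode back to X̂, and make X̂ approximate X) can be sketched in a few lines. The following is an illustrative linear autoencoder trained by gradient descent on synthetic data; the dimensions, learning rate, and data are all made up for illustration and are not a model from this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 samples in 8-D that actually live on a 3-D subspace,
# so a 3-unit bottleneck Y can carry everything needed to rebuild X.
basis = rng.normal(size=(3, 8))
X = rng.normal(size=(200, 3)) @ basis

d_in, d_code = 8, 3
W = rng.normal(scale=0.1, size=(d_code, d_in))   # encoder:  Y = X W^T
A = rng.normal(scale=0.1, size=(d_in, d_code))   # decoder:  X_hat = Y A^T

def reconstruction_error(X, W, A):
    X_hat = (X @ W.T) @ A.T
    return np.mean((X - X_hat) ** 2)

lr = 0.02
err_before = reconstruction_error(X, W, A)
for _ in range(3000):
    Y = X @ W.T
    R = Y @ A.T - X                       # residual X_hat - X
    grad_A = (R.T @ Y) / len(X)           # gradient of the squared error w.r.t. A
    grad_W = ((R @ A).T @ X) / len(X)     # gradient w.r.t. W (chain rule through A)
    A -= lr * grad_A
    W -= lr * grad_W
err_after = reconstruction_error(X, W, A)
```

Because the data are exactly rank 3 and the bottleneck has 3 units, the reconstruction error drops to near zero, illustrating that X can indeed be "perceived via Y".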
Another early bidirectional learning example is the Least Mean Square Error
Reconstruction (Lmser) self-organizing network that was first proposed in 1991
[40,46], which performs the PC principle by a mirror architecture similar to AE
but differs in that the encoder and decoder are overlapped, resulting in several
favourable characteristics, see further details in Table 3 of [57].
First, neurons per layer in the encoder and the decoder are bidirectionally
connected pairwise to the corresponding neurons on the corresponding layer,
with each neuron taking a dual role in both the encoder and decoder as illus-
trated in Fig. 2(d), which is referred to as Duality in Paired Neurons (DPN) [57].
Specifically, this duality leads to the following three extensions of autoencoder.
(a) Fast-lane Lmser: as illustrated in Fig. 2(b), neurons per layer in the encoder
are directed pairwise to their counterparts in the decoder, in a role similar
to skip connections in U-net [35], ResNet [16] and DenseNet [22].
(b) Feedback Lmser: as illustrated in Fig. 2(c), neurons per layer in the decoder
are directed pairwise to their counterparts in the encoder, in a role similar
to those in recurrent neural networks (RNN) for enhancing robustness.
(c) Lmser and flexible Lmser: as illustrated in Fig. 2(d), each neuron per layer j
enforces the activity v (j) in the encoder and the activity u(j) in the decoder
to become identical, which implies that the encoding action from the j-th
layer up to the top and the decoding action from the top down to the j-
th layer jointly perform an identical mapping v (j) = u(j) . In other words,
in addition to seeking one global identical mapping X = X̂ as AE does,
identical mappings are also sought for distributed implementation of the PC
principle, not only from the j-th layer to the top and then down to the j-th
layer, but also between the j-th layer and the j + 1-th layer.
Second, bidirectional layered development of cognition is also considered in
Lmser by another dual nature called Duality in Connection Weights (DCW), with
the same connecting weights between every two consecutive layers taking a
dual role both in the encoder and in the decoder. From the j-th to the j + 1-th
layer, we have Aj = WjT in Lmser as illustrated in Fig. 2(d), while Aj is learned
without the constraint Aj = WjT as illustrated in Fig. 2(e). When Wj is an
orthogonal matrix, Aj = WjT approximately acts as its pseudo-inverse, such that
Aj Wj = WjT Wj ≈ I. In other words, DCW enhances distributed implementation of
the PC principle consecutively from the j-th to the j + 1-th layer.

Fig. 2. Lmser differs from Autoencoder. (a) Autoencoder without the dualities DPN
and DCW, (b) Fast-lane Lmser with DPN only in skip direction, (c) Feedback Lmser
with DPN only in feedback direction, (d) Lmser with DPN and DCW.
Insights may also come from an incremental bidirectional layered develop-
ment of cognition, as illustrated in Fig. 3(b). Perception and learning start at the
bottom layer, i.e., a one-layer Lmser that learns templates of feature extraction, as
demonstrated empirically in 1991 [40,46], with details given in a recent rein-
vestigation [23] and a systematic survey [57]. Then, another layer is topped
on the learned one-layer Lmser, and learning is further made on the resulting
two-layer Lmser, and so on and so forth. This procedure is similar to learning
stacked RBMs [18,19], as illustrated in Fig. 3(c). The importance of the DPN
and DCW dualities may be interestingly traced through Hinton’s progress
on bidirectional learning, from Helmholtz machines [11,17], which consider an
architecture that sits between AE and Lmser (i.e., with DPN but without
DCW), to stacked RBMs, which share both the DPN and DCW dualities.
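The incremental layer-by-layer development just described can be sketched as greedy training of tied-weight (A = Wᵀ, i.e. DCW) linear layers: train one layer to reconstruct its input, freeze it, then train the next layer on its codes. This is a toy linear sketch under made-up data and settings, not the paper's Lmser implementation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Rank-2 toy data in 8-D, so a 4-unit then a 2-unit code can keep all its content.
X = rng.normal(size=(300, 2)) @ (0.5 * rng.normal(size=(2, 8)))

def train_tied_layer(X, d_code, lr=0.01, steps=3000, seed=0):
    """One tied-weight (A = W^T) linear layer trained to reconstruct its input X."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.1, size=(d_code, X.shape[1]))
    for _ in range(steps):
        R = X @ W.T @ W - X                         # reconstruction residual
        W -= lr * W @ (R.T @ X + X.T @ R) / len(X)  # gradient of the tied-weight loss
    return W

# Incremental development: train layer 1, freeze it, then train layer 2 on its codes.
W1 = train_tied_layer(X, d_code=4)
Y1 = X @ W1.T
W2 = train_tied_layer(Y1, d_code=2, seed=1)

# Top-down reconstruction through both tied layers.
X_hat = ((Y1 @ W2.T) @ W2) @ W1
err_final = np.mean((X - X_hat) ** 2)
baseline = np.mean(X ** 2)
```

Since the toy data are rank 2, the two greedily trained layers jointly reconstruct X nearly perfectly, in the spirit of stacking layers one at a time.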

Fig. 3. Incremental bidirectional layered development of cognition. (a) Mathematical


details of Lmser and flexible Lmser, (b) Perception and learning start at the bottom
layer, and next another layer is topped on it, (c) Stacked RBMs and hierarchical rep-
resentation developed from lower to higher layers to detect features from local ones to
global ones [18, 19], (d) An analog to light-ray propagation in layered media [55].

If we have samples of Y too, incremental bidirectional layered development
may also be made from the top layer down by the back-propagation supervised
learning technique, layer by layer, for both the direction X → Y and the direction
Y → X, which may be coordinated via the DCW constraints Aj = WjT in both
directions. In general, as illustrated in Fig. 2(d), the information flows of forward
propagation versus backward propagation are actually coupled, in analogy to light
propagation via layered media [55].
Third, bidirectional layered development of cognition is also accompanied
by another advantage that helps to preserve the relation of neighbourhood
and topology, as well as the hierarchy of abstract concepts. It was addressed in
[55] that a large number of layers helps to accommodate a hierarchy and thus to
preserve these relations. Also, it follows from Fig. 6 and the last paragraph of [56]
that topology can be preserved by a multilayer net cascading a typical three-step
structure, that is, FAN-in linear summation activates a post-nonlinear function
and then FAN-out propagates to the neurons of the next layer, as long as the
post-nonlinear function satisfies a property that holds for a classic sigmoid
function or a typical LUT function. Performing the PC principle, even one-layer
Lmser makes inner neurons become independent [40,46], which helps to pre-
serve the relation of neighbourhood and topology. Moreover, as discussed above,
multilayer Lmser performs the distributed PC principle, which not only enhances
preservation of neighbourhood and topology, but also facilitates forming the
hierarchy of concepts via conditional independence. The importance of this pre-
serving nature will be further addressed in the next section.
Last but not least, the DPN and DCW dualities enable modular devel-
opment of cognition too, with the help of checking the discrepancy either between
the bottom-up perception v (j) and the top-down reconstruction u(j) or between
the bottom-up X (j) and the top-down X̂(j+1) . As recently addressed by items
(10)–(14) in Section 2 of Ref. [57], checking whether such discrepancies are
bigger than some pre-specified thresholds will allocate inputs to one of multiple
Lmser networks in a pipeline or a mixture.

3 Deep IA-BI Cognition and Image Thinking: From


Hubel-Wiesel vs Chen to One Combined Scheme

Starting from the early fifties of the last century, the mainstream studies on percep-
tion and cognition proceeded along the direction of the feature detector hypothesis on
visual information processing, exemplified by the 1981 Nobel Prize winners Hubel
and Wiesel, who developed a feature detection theory [24,25]. They found that
some neurons, called simple cells, fire rapidly when presented with lines at
one angle, while others respond best to another angle, and also complex cells
that detect edges regardless of where they are placed in the receptive field and
can preferentially detect motion in certain directions. Feature detection pro-
ceeds from detecting directions at the bottom gradually up to organising into more
complicated patterns in a manner of hierarchy, as illustrated in Fig. 1(a).

This H-W theory has greatly impacted the subsequent efforts on modelling of
intelligence in the studies of artificial intelligence and neural networks, roughly
summarised into three streams as follows:

(1) In the late seventies and early eighties, computing power was too limited
to support the demands of AI studies. Marr and Poggio [30] proposed a
simplified scheme of only three layers, which became popular in the AI
literature during the eighties of the last century. However, recent advances
of deep learning for computer vision have actually abandoned this stream.
(2) Fukushima was the first who attempted to build up a computational model
that is loyal to the H-W scheme, proposing Cognitron [13] and then developing
it into Neocognitron [14], in which equations for S-cells and C-cells are both
provided, with the connections for S-cells modified by learning while those for
C-cells are pre-fixed [15]. Neocognitron may be regarded as a junior version of
the current convolutional neural networks (CNN) [26,27]. The convolutional
layers of CNN share the same point as S-cells, aiming at recognising stimu-
lus patterns based on geometrical similarity (Gestalt), while the pooling
layers of CNN act like C-cells. The key difference is that Neocognitron modifies
weights by self-organising while CNN modifies weights by back-propagation.
(3) There are also two roads on how feature detectors or receptive fields are
developed. One considers Gabor functions or other wavelets, with parameters
estimated from data. The other considers how neurons in a simple model
shown in Fig. 4(a) develop receptive fields [2,6,29,36,37,40,46]. At the first
stage, it was found that receptive fields come from the evolution of
weight vectors by a Hebb-type learning. A typical example is Linsker’s feed-
forward linear network [29] with local connections from a lower layer to its
next layer, which is functionally similar to a convolutional layer. Each neuron
in this linear network is a special case of the one illustrated in Fig. 4(a) with
simply a linear activation s(r) = r. It follows [36,37] that such a type of
evolution may come from either principal component analysis (PCA) or
maximum information transfer.
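As a concrete instance of the Hebb-type evolution of receptive fields just mentioned, the classic Oja rule (a Hebbian update y·x with a normalizing decay) drives a single linear unit's weight vector toward the principal component of its input, which is the PCA connection noted above. The data and settings below are illustrative, not taken from the cited works.

```python
import numpy as np

rng = np.random.default_rng(2)

# 2-D data stretched along a known direction: the principal axis is (1, 1)/sqrt(2).
axis = np.array([1.0, 1.0]) / np.sqrt(2)
X = np.outer(rng.normal(scale=3.0, size=5000), axis) \
    + rng.normal(scale=0.3, size=(5000, 2))

w = rng.normal(size=2)
w /= np.linalg.norm(w)
lr = 0.001
for x in X:
    y = w @ x                  # linear unit, s(r) = r
    w += lr * y * (x - y * w)  # Oja's rule: Hebbian term y*x plus a decay
                               # that keeps ||w|| near 1

alignment = abs(w @ axis)      # |cosine| between w and the true principal direction
```

After one pass over the samples, w aligns with the principal axis and its norm settles near 1, i.e. the unit has "developed a receptive field" matching the dominant structure of its input.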

However, learning with linear neurons breaks coupled symmetry poorly. It was
first found in [40] that one-layer Lmser with a sigmoid nonlinearity s(r)
reduces the coupling of detectors, acting like independent component analysis
(ICA), which was later further studied with improved feature detectors [2]. The one-
layer Lmser learning rule shown in Fig. 4(c) is closely related to the learning rule
used in the classical stacked RBMs as illustrated in Fig. 4(b) [18,19], sharing
a common part (see the boxes in red dashed lines). Also, performing
learning in stacked RBMs from lower layers to higher ones develops a hierarchy
of feature maps from local ones to more global ones, as illustrated in Fig. 4(c).
Fig. 4. Development of feature detectors or receptive fields. (a) simple neuron that
develops receptive field, (b) feature detector by one layer RBM, (c) Lmser and RBM
share a same structure and also a common term (i.e., two red dashed boxes) in learning.
(Color figure online)

Rooted in Hubel-Wiesel, all the above studies are featured by local-to-global
and bottom-up development of cognition. In contrast, Chen believes [4,5] that
the perceptual process is from global to local: wholes are coded prior to analyses
of their separable properties or actions, following the perceptual organisation
of Gestalt psychology. Proceeding far beyond the notion “whole is more than
the simple sum of actions”, Chen suggests that “holistic registration is prior to
local analyses” and emphasises topological structure in visual perception. This
‘prior’ has two meanings. One implies that global organisations, determined by
topology, are the basis on which perception of local geometrical properties depends.
The other is that topological perception (based on physical connectivity)
occurs earlier than the perception of local geometrical properties. Though there
has been a good deal of evidence that supports this global precedence, there has
been no computational approach that successfully implements or even illustrates it,
which remains an attractive challenge.

Fig. 5. IA bidirectional scheme combines local-to-global and global-to-local theses. (a)


fast-lane for perceiving icon, (b) bottom up perception, (c) bidirectional implementa-
tion, (d) & (e) top-down reconstruction jointly driven by coding vector and icon, plus
fast lane for top down attention.

As illustrated in Fig. 5 and also Fig. 1(b), we explore a computational scheme


that combines the local-to-global bottom up development and the global-to-local
top down attention. In general, each inner code consists of three ingredients. One
is a label or a string of symbols that perceives the current input into the corre-
sponding concept and a parsing tree that organises several concepts. The second
is a vector of attributes that describe the input in a high dimensional feature
space. These two have been widely encountered in the conventional studies. The
third ingredient considers the simplest representation of spatial, structural, and
topological dependence, as illustrated on the top of Fig. 5(d) and here shortly
called icon. In the existing studies, this ingredient has been rarely considered.
Instead, at least one layer of a fully connected network is topped on a CNN to
output a vector as an inner code, even when we use CNN networks on images.
Following Chen’s view, such icons are believed to take an important role in
image thinking. As illustrated in Fig. 5(a) & (b) & (c), the bottom-up perception
may have a fast lane to perceive such icons, such that “holistic registration is
prior to local analyses”, in coordination with certain accumulated knowledge.
A rough example may be reducing a high-resolution image into a much lower
resolution, e.g., in a way similar to one recent study [28]. Further study may
consider adjusting icons for initialisation via bidirectional learning.
On the other hand, the top-down reconstruction of images may be jointly
driven by a coding vector and an icon that acts as a structural and topological
prior, again in coordination with certain accumulated knowledge. In implemen-
tation, such icons go down directly via CNN blocks in place of the usual fully
connected networks. Also, there may be fast-lane skip connections (e.g., as
illustrated in Fig. 5(d) in Ref. [57]) to provide some top-down attention, as
illustrated in Fig. 5(c) & (d) & (e).
This combined scheme acts as the fundamental part of IA-BI in Fig. 1, which
echoes Chinese thought in terms of not only the holistic Chinese philosophy that has
been advocated recently by efforts on machine learning [42,47,48,51–54], but also Chi-
nese characters, which favour images in thinking and communication. There have
been exemplary efforts by Chinese scientists in recent decades. Qian thought
that image thinking plays a leading role in the creative process [34]. Pan proposed
synthesis reasoning and expounded its relationship with image thinking [31],
and Chen advocated the global-to-local principle for cognition [4,5].

4 Deep IA-BI Cognition and Abstract Thinking:


Searching, Optimising, and Reasoning

We start by considering abstract thinking in a general sense that deals with
symbolic or graphical representations with the help of discrete mathematics, featured
by typical tasks as illustrated in Figs. 6 and 7.
The first task is search vs selection. The simple case is selecting among a finite
number of individuals. The same situation can be found in making a decision
or classifying/allocating among a number of choices. Beyond the simple case,
search is performed by sequential decision making to find a path, i.e., making a
selection per step as illustrated in Fig. 6(a). Typically, search is finding an optimal
path in a tree or spanning a tree as illustrated on the top of Fig. 6(b), which is
typically encountered in solving the travelling salesman problem (TSP) and
attributed graph matching (AGM), as well as in AlphaGo.
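The path-search-by-sequential-selection described above can be illustrated with a minimal depth-first search that enumerates root-to-goal paths in a toy weighted graph and keeps the cheapest one. The graph below is made up purely for illustration.

```python
# A minimal depth-first search over a small weighted graph: per step we select
# a successor, and the best complete root-to-goal path found so far is kept.
graph = {
    "A": {"B": 1, "C": 4},
    "B": {"C": 1, "D": 5},
    "C": {"D": 1},
    "D": {},
}

def dfs_best_path(node, goal, cost=0, path=None, best=None):
    path = (path or []) + [node]
    if best is None:
        best = {"cost": float("inf"), "path": None}
    if node == goal:
        if cost < best["cost"]:
            best["cost"], best["path"] = cost, path
        return best
    for nxt, w in graph[node].items():
        dfs_best_path(nxt, goal, cost + w, path, best)
    return best

best = dfs_best_path("A", "D")
# best["path"] == ["A", "B", "C", "D"], best["cost"] == 3
```

Exhaustive enumeration like this is exactly what becomes intractable as the tree grows, which motivates the guided search discussed next.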
The second task is satisfaction vs optimisation. The target of searching can
be not only one or more nodes but also one or more paths, such that some
specifications or conditions are satisfied. Moreover, we may select an optimal
one or ones among all the satisfactory ones. Traditionally, tree searching usually
encounters intractable computing difficulties. Exemplified by AlphaGo [38] and
AlphaGoZero [39], searching performances can be significantly improved with
help of an appropriate A-mapping via deep neural networks that provides either
or both of selecting probabilities and value heuristics. Furthermore, optimisation
may also be performed on other tasks of graph analyses, as illustrated in Fig. 6(d).
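The way AlphaGo-style search combines an A-mapping's outputs with tree statistics can be sketched with a PUCT-style score, which adds to each action's running value Q an exploration bonus proportional to the policy prior P. The statistics below are made-up numbers for one illustrative node, not data from [38,39].

```python
import math

def puct_select(stats, c_puct=1.5):
    """stats: {action: (Q, N, P)}; pick the action maximizing
    Q + c_puct * P * sqrt(total_N) / (1 + N), as in PUCT-style tree search."""
    total_n = sum(n for _, n, _ in stats.values())
    def score(action):
        q, n, p = stats[action]
        return q + c_puct * p * math.sqrt(total_n) / (1 + n)
    return max(stats, key=score)

stats = {
    "a1": (0.60, 40, 0.20),   # well explored, good running value
    "a2": (0.30,  2, 0.70),   # barely explored, but the policy prior likes it
    "a3": (0.10, 10, 0.10),
}
choice = puct_select(stats)   # "a2": the prior steers exploration toward it
```

The example shows how a deep network's selecting probabilities (P) and value heuristics (Q) jointly prune the search: a barely visited action with a strong prior can outrank a well-explored one.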
Here, we sketch one IA scheme for solving a doubly stochastic matrix (DSM)
featured combinatorial task, or shortly IA-DSM optimisation, e.g., the TSP task
illustrated in Fig. 6(c). The solution is featured by the following problem
minV E(V ),
V = [vij ], i, j = 1, · · · , n, vij ∈ {0, 1}, Σj vij = 1, Σi vij = 1,  (1)

where vij = 1 indicates going from the ith city to the jth city, and there are exactly
n entries with vij = 1, which together define a TSP tour. The problem is finding a
shortest TSP tour such that the cost E(V ) is minimised, which is typically NP-hard.
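One standard way to enforce the doubly stochastic constraints of Eq. (1) in a continuous relaxation is Sinkhorn-Knopp normalization, which alternately rescales the rows and columns of a positive matrix until both sum to one. This is offered as an illustrative projection step onto the DSM constraints, not necessarily the iteration used by the LT algorithms cited below.

```python
import numpy as np

def sinkhorn(M, iters=200):
    """Alternately normalize rows and columns of a positive matrix so it
    converges to a doubly stochastic matrix (Sinkhorn-Knopp iteration)."""
    V = np.asarray(M, dtype=float)
    for _ in range(iters):
        V = V / V.sum(axis=1, keepdims=True)   # make rows sum to 1
        V = V / V.sum(axis=0, keepdims=True)   # make columns sum to 1
    return V

rng = np.random.default_rng(3)
V = sinkhorn(rng.uniform(0.1, 1.0, size=(5, 5)))
# V now satisfies the relaxed constraints of Eq. (1): all row sums and
# column sums equal 1, with 0 < v_ij < 1 instead of v_ij in {0, 1}.
```

A combinatorial solver can then anneal such a relaxed V back toward a hard 0/1 permutation while decreasing E(V).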
Since 1994, a class of neural network algorithms (named LT algorithms)
[8–10,41,43] has been developed by iteratively solving a mirror of Eq. (1) with

Fig. 6. Bidirectional searching. (a) simple selection and depth first search (DFS), (b)
IA-DSM scheme, (c) travelling salesman problem, (d) other graph analyses.
Another random document with
no related content on Scribd:
be determined.

Jeduthun] See note on xvi. 41.

Berechiah the son of Asa, the son of Elkanah] Not mentioned in


Nehemiah He probably represented the Kohathite division of the
singers; compare vi. 33‒38 (18‒23 Hebrew), where the name
Elkanah occurs several times in the genealogy of the Kohathites.

the villages of the Netophathites] Compare Nehemiah xii. 28, 29


(Revised Version), whence it appears that these villages were close
to Jerusalem. The exact site is uncertain.

17‒27 (compare Nehemiah xi. 19; 1 Chronicles xxvi. 1‒19).


Organisation and Duties of the Porters (Doorkeepers).

17‒27. The same subject is treated in xxvi. 1‒19, and this fact
has been urged as an argument for the view that chapter ix. is an
addition to the Chronicler’s work (see Introduction p. xxiii). But it is
also reasonable to suppose that the Chronicler would here give a
register of inhabitants of Jerusalem (which could not be included in
the list of the separate tribes), and such a register would probably
give a survey of the Levitical classes.

The verses present on analysis several confusing features, see


notes on verses 17, 22, 25, 33.

¹⁷And the porters; Shallum, and Akkub, and


Talmon, and Ahiman, and their brethren:
Shallum was the chief;
17. And the porters] Render, doorkeepers as in xvi. 38 and xxvi.
1. In Solomon’s Temple there were “keepers of the threshold,” three
in number (2 Kings xxv. 18), priests in rank (2 Kings xii. 9).

A distinction between the doorkeepers and the Levites (verse 14)


seems to be implied, but in verses 19, 26 the doorkeepers, or at any
rate their leaders, are called Levites (compare Nehemiah xi. 15, 19
with 1 Chronicles xxvi.). The supposed distinction may have died out
before the Chronicler’s period, or perhaps earlier and later stages
are reflected in the chapter (see also the note on verse 26).

Shallum...Ahiman] These two names are absent from Nehemiah


xi. 19 together with the clause Shallum was the chief. This omission
of all reference to Shallum must be accidental.

Shallum, and Akkub, and Talmon] The three names represent


families, not individuals; compare Ezra ii. 42 = Nehemiah vii. 45,
where the fuller form is given, the children of Shallum, ... the children
of Talmon, the children of Akkub.... These names persist in the five
lists of porters which refer to post-exilic times; Ezra ii. 42 =
Nehemiah vii. 45; Nehemiah xi. 19 = 1 Chronicles ix. 17 (Shallum is
to be supplied in Nehemiah from Chronicles); Nehemiah xii. 5
(Meshullam = Shallum). For the Chronicler’s traditions of Levites,
singers, and doorkeepers of the Davidic period, see chapters xv.,
xxiv. ff.

Ahiman] Elsewhere in the Old Testament this name occurs only


among the names of the sons of Anak, and it is probable that the
Chronicler (or some scribe) made here an error of transcription, and
that Ahiman has arisen from the word aheihem “their brethren” which
follows. A fourth name was probably given in the original text, for see
verse 26.

¹⁸who hitherto waited in the king’s gate


eastward: they were the porters for the camp
of the children of Levi.
18. who] i.e. Shallum (verse 17), called Shelemiah in xxvi. 14 (=
Meshelemiah, xxvi. verse 1). As mentioned above, a family is meant.

hitherto] i.e. to the time of the Chronicler.


the king’s gate eastward] That the king had an entrance into the
Temple named after him appears from 2 Kings xvi. 18, and that this
gate was on the east from Ezekiel xlvi. 1, 2.

for the camp of the children of Levi] i.e. the Temple; but the
phrase, which is derived from Numbers ii. 17, in its original context of
course signifies the Tabernacle of the Mosaic period. Doubtless it is
used with the implication that the institution of the gatekeepers dated
back to that age: compare verse 19 ad fin., and contrast verse 22.

¹⁹And Shallum the son of Kore, the son of
Ebiasaph, the son of Korah, and his brethren,
of his father’s house, the Korahites, were over
the work of the service, keepers of the gates ¹
of the tabernacle ²: and their fathers had been
over the camp of the Lord, keepers of the
entry;
¹ Hebrew thresholds. ² Hebrew Tent.

19. over the camp of the Lord, keepers] We might expect the
reference to the Temple or Tabernacle to be continued; but, as
nothing is said in the Pentateuch of “keepers of the entry to the
tabernacle,” probably the entry to the camp, not to the tabernacle, is
meant in the present phrase. With this view agrees the mention of
Phinehas (verse 20), for it apparently was the profanation of the
camp in general, not of the tabernacle, which Phinehas avenged
(Numbers xxv. 6‒8), thus earning a blessing (Numbers xxv. 11‒13).

²⁰and Phinehas the son of Eleazar was ruler
over them in time past, and the Lord was with
him.
20. and the Lord was with him] Render, May the Lord be with
him, a pious exclamation, customary on mentioning the name of a
famous and righteous person deceased. The phrase is common in
later Jewish literature; but this passage seems to be the earliest
instance of its use.

²¹Zechariah the son of Meshelemiah was
porter of the door of the tent of meeting. All
these which were chosen to be porters in the
gates ¹ were two hundred and twelve.
¹ Hebrew thresholds.

21. Zechariah the son of Meshelemiah] Compare xxvi. 2, 14,
according to which Zechariah’s watch was on the north.

the tent of meeting] The reference would be to the Mosaic tent, if
the verse be taken, as is natural, in close connection with verses 19,
20. If the verse be treated in conjunction with verse 22 it must refer
to the tent of the ark in David’s time. The ambiguity is perhaps
intentional.

²²These were reckoned by genealogy in their
villages, whom David and Samuel the seer did
ordain in their set ¹ office.
¹ Or, trust.

22. All these] Compare Ezra ii. 42 (= Nehemiah vii. 45);
Nehemiah xi. 19. The discrepancy in numbers between Chronicles
and Nehemiah and also between Nehemiah vii. and Nehemiah xi.
may be explained by supposing some difference in the manner of
reckoning or some difference in the period referred to.
in their villages] Compare verses 16 and 25.

whom David ... did ordain] The Chronicler attributes to David the
organisation of the priests (xxiv. 3), of the Levites (xxiii. 27; xxiv. 31),
of the singers (xxv. 1 ff.), and of the doorkeepers (in this passage). It
has been thought that this verse is at variance with verses 18, 19,
where the Mosaic origin of the gatekeepers seems to be implied. But
in answer it may be said that the Chronicler is guilty of no
inconsistency in ascribing the origin of the doorkeepers to the
Mosaic period and saying here that David and Samuel “ordained
them in their set office,” for the phrase refers, not to their origin, but
to their organisation. For another suggestion see below on verse 26.

Samuel] The association of Samuel with the organisation of the
sanctuary is confined to this passage, and is a significant illustration
of the working of late Jewish thought, which was little concerned with
historic probability and much with edification. The tradition has
probably arisen from the remark in 1 Samuel iii. 15, that Samuel
“opened the doors of the house of the Lord.” As Samuel died before
the reign of David, the Chronicler doubtless does not intend to
represent him as contemporary with David in the organisation of the
Temple, but probably supposes that Samuel’s work was done in
connection with the tent, which according to the Chronicler was
located in Gibeon (2 Chronicles i. 3).

the seer] For the title, xxvi. 28, xxix. 29; 1 Samuel ix. 9; and
compare 2 Chronicles xvi. 7.

in their set office] or in their trust; i.e. in their responsible
positions.

²³So they and their children had the oversight
of the gates of the house of the Lord, even
the house of the tabernacle ¹, by wards.
¹ Hebrew Tent.
23. the house of the tabernacle] margin Tent. The phrase
designates the period before the building of the Temple.

²⁴On ¹ the four sides were the porters, toward
the east, west, north, and south.
¹ Hebrew Towards the four winds.

24. On the four sides] Fuller details are given in xxvi. 14‒18.

²⁵And their brethren, in their villages, were to
come in every seven days from time to time to
be with them:
25. in their villages] No special villages inhabited by porters are
mentioned, but perhaps porters as well as singers dwelt in the
“villages of the Netophathites” (verse 16; Nehemiah xii. 28, Revised
Version).

²⁶for the four chief porters, who were Levites,
were in a set office, and were over the
chambers and over the treasuries in the house
of God. ²⁷And they lodged round about the
house of God, because the charge thereof
was upon them, and to them pertained the
opening thereof morning by morning.
26. the four chief porters, who were Levites] It seems clear from
this verse (and from the structure of the chapter, compare verses 10,
14, 17—as is pointed out in the note on verse 17) that the
doorkeepers were not, as a body, Levites; and according to verse 25
they dwelt outside Jerusalem, whilst their leaders (verse 27) were
within the city. Perhaps this distinction between the leaders and the
rank and file could be used to explain the supposed inconsistency (if
any exists—see above verse 22, note on whom David ...) between
verses 19 and 22, as regards the tradition of origin: it might be said
that whilst the leaders claimed that their office dated from the time of
Moses (verse 19), the rank and file traced their institution to David
(verse 22). (In 2 Chronicles xxxiv. 9 Levites appear exercising the
duties of doorkeepers, but this does not prove that all doorkeepers
were Levites.)

chambers] i.e. store-chambers in which tithes and sacred vessels
were kept; compare 2 Chronicles xxxi. 5, 11, 12; Nehemiah xiii. 4‒9:
in verse 33 of this chapter they seem to be in use also as rooms in
which Levites could dwell. The chambers were probably built as
outbuildings round the Court of the Temple; compare xxiii. 28, xxviii.
12.

28, 29.
Duties of the Levites.

²⁸And certain of them had charge of the
vessels of service; for by tale were they
brought in and by tale were they taken out.
28. And certain of them] The reference is to the Levites. The
contents of verses 28, 29 clearly refer to Levitical duties (compare
xxiii. 29), and the transition from porters to Levites is made easier by
the fact that the four porters last mentioned (verses 26, 27) are
Levites. Some commentators hold that the paragraph dealing with
the duties of the Levites begins in verse 26 with the words “And they
were over,” etc.

²⁹Some of them also were appointed over the
furniture, and over all the vessels of the
sanctuary, and over the fine flour, and the
wine, and the oil, and the frankincense, and
the spices.
29. compare xxiii. 29.

30.
A Priestly Duty.

³⁰And some of the sons of the priests prepared
the confection of the spices.
30. the sons of the priests] i.e. “members of the priesthood,
priests.” Compare 2 Chronicles xxv. 13, “the soldiers of the army”
(literally “the sons of the troop”) and the common expression “the
sons of the prophets.”

confection] (For the word, compare 1 Samuel viii. 13, Revised
Version text and margin) This “ointment” was peculiarly holy (Exodus
xxx. 23‒25). The Levites might have charge of the oil and spices
(verse 29), but only the priests might make the confection.

31, 32.
Other Levitical Duties.

³¹And Mattithiah, one of the Levites, who was
the firstborn of Shallum the Korahite, had the
set office over the things that were baked in
pans.
31. who was the firstborn of Shallum] In xxvi. 2 the firstborn of
Meshelemiah (= Shallum) is called Zechariah. Probably Mattithiah
and Zechariah represent each a household belonging to an elder
branch of the great family of Shallum.
³²And some of their brethren, of the sons of
the Kohathites, were over the shewbread, to
prepare it every sabbath.
32. the shewbread] Literally the bread of the Row (or of the Pile),
for it had to be arranged in order before the Lord (Leviticus xxiv. 6).
The Chronicler prefers this term to the older “Bread of the Presence”
(i.e. of Jehovah). See more fully Driver, Exodus, pp. 274, 275, in this
series.

to prepare it every sabbath] “Every sabbath he shall set it in order
before the Lord continually” (Leviticus xxiv. 8). In 2 Chronicles ii. 4
(= ii. 3, Hebrew) it is called the continual shewbread (literally “the
continual Row”).

³³And these are the singers, heads of fathers’
houses of the Levites, who dwelt in the
chambers and were free from other service:
for they were employed in their work day and
night. ³⁴These were heads of fathers’ houses
of the Levites, throughout their generations,
chief men: these dwelt at Jerusalem.
33. And these are] This verse may be intended as a conclusion to
verses 15, 16, for the names there given are those of singers;
compare Nehemiah xi. 17. On the other hand it may have been
intended as the heading of such a list as appears in vi. 33‒47 (= 18‒
32, Hebrew), the list itself having somehow been omitted.

day and night] Compare Psalms cxxxiv. 1; Revelation iv. 8.

35‒44 (= viii. 29‒38).
The Genealogy of the house of Saul.
³⁵And in Gibeon there dwelt the father of
Gibeon, Jeiel, whose wife’s name was
Maacah: ³⁶and his firstborn son Abdon, and
Zur, and Kish, and Baal, and Ner, and Nadab;
³⁷and Gedor, and Ahio, and Zechariah, and
Mikloth. ³⁸And Mikloth begat Shimeam. And
they also dwelt with their brethren in
Jerusalem, over against their brethren. ³⁹And
Ner begat Kish; and Kish begat Saul; and Saul
begat Jonathan, and Malchi-shua, and
Abinadab, and Eshbaal. ⁴⁰And the son of
Jonathan was Merib-baal; and Merib-baal
begat Micah. ⁴¹And the sons of Micah; Pithon,
and Melech, and Tahrea, and Ahaz. ⁴²And
Ahaz begat Jarah; and Jarah begat Alemeth,
and Azmaveth, and Zimri; and Zimri begat
Moza: ⁴³and Moza begat Binea; and Rephaiah
his son, Eleasah his son, Azel his son: ⁴⁴and
Azel had six sons, whose names are these;
Azrikam, Bocheru, and Ishmael, and
Sheariah, and Obadiah, and Hanan: these
were the sons of Azel.
See notes on viii. 29 ff. The passage serves here as an
introduction to the story of the death of Saul. Whether it is in its
original setting here or in viii. 29 ff., or possibly is original in both
chapters, there is not sufficient evidence to determine (see note on
viii. 29).
Chapters X.‒XXIX.
The Reign of David.
At this point the Chronicler begins his narrative of Israel’s history.
It commences abruptly with an account of the defeat and death of
Saul, which however is given not for its own interest, but to serve as
a brief introduction to the reign of David (chapter xi. ff.). Why does
the Chronicler choose to begin his narrative at this point, passing
over in silence (a) the Mosaic period, (b) the stories of Judges and of
1 Samuel i.‒xxx.? As regards (a) his silence is due to the
assumption that those for whom he writes are no less familiar than
he is himself with the account of the Mosaic age as presented by the
fully developed tradition of the Pentateuch. As for (b), his silence
probably arises neither from the difficulty of retelling the narratives of
Judges in accordance with his theory of the early history, nor yet
from the fact that they were doubtless familiar to his readers; but,
again, from a consideration of the central purpose of his work. His
theme is the Divine guidance of Israel’s destiny, and, since that
destiny had ultimately centred upon the fortunes of Jerusalem and
the worship maintained through its Temple, all else in Israel’s history
becomes of quite secondary importance. He begins therefore where
(for Israel) Jerusalem and the Temple began—with David, who
conquered the city and planned the Temple. The tales of the Judges,
of Samuel, and of David’s early life and his magnanimity toward Saul
(a tempting source for the exaltation of the character of the ideal
king), all these are logically ignored, since they lie outside the scope
of the Chronicler’s design.

Chapter X.
1‒12 (= 1 Samuel xxxi. 1‒13).
The Defeat, Death, and Burial of Saul.

1‒12. There are several variations between the text given here
and the text of 1 Samuel, to which attention will be called in the
notes below.

¹Now the Philistines fought against Israel:
and the men of Israel fled from before the
Philistines, and fell down slain ¹ in mount
Gilboa.
¹ Or, wounded.

1. in mount Gilboa] In the campaign of Gilboa the Philistines
showed new and skilful strategy. Instead of at once marching
eastward up the ravines which lead into Judah and Benjamin—in
which there was no room for their chariots (2 Samuel i. 6) to
manœuvre—they first marched northward along the sea-coast and
then turned eastward just before reaching Mount Carmel. This
movement brought them into the great fertile plain watered by the
Kishon, ground over which chariots could act with decisive effect. At
the north-east end of the plain rose the heights of Gilboa. When Saul
and his Benjamites advanced to meet the Philistines, the latter
succeeded in interposing themselves between the Israelite army and
its base in Benjamin—an easy achievement for an enemy who by his
chariots possessed a high degree of mobility. Saul was therefore
driven to take up his position on the north side of the plain on Mount
Gilboa, where he was attacked by the Philistines, probably from the
south-west on which side the slopes of the mountain are
comparatively gentle. The Israelites, cut off from their homes,
outmarched, outgeneralled, and probably outnumbered, were
speedily routed. The battle of Gilboa was won, like that of Hastings,
by cavalry (chariots) and archers (verse 3) against infantry, which
was obliged to stand on the defensive, under pain of being cut to
pieces if it ventured to attack. See G. A. Smith, Historical Geography
of the Holy Land, pp. 400 ff.

²And the Philistines followed hard after Saul
and after his sons; and the Philistines slew
Jonathan, and Abinadab ¹, and Malchi-shua,
the sons of Saul. ³And the battle went sore
against Saul, and the archers overtook him;
and he was distressed by reason of the
archers.
¹ In 1 Samuel xiv. 49, Ishvi.

2. Malchi-shua] This is the correct spelling, not Melchi-shua.

⁴Then said Saul unto his armourbearer, Draw
thy sword, and thrust me through therewith;
lest these uncircumcised come and abuse ¹
me. But his armourbearer would not; for he
was sore afraid. Therefore Saul took his
sword, and fell upon it. ⁵And when his
armourbearer saw that Saul was dead, he
likewise fell upon his sword, and died.
¹ Or, make a mock of me.

4. unto his armourbearer] Compare Judges ix. 54 (the death of
Abimelech). One function of an armourbearer was to give the “coup
de grâce” to fallen enemies (1 Samuel xiv. 13), but sometimes the
same office had to be executed for friends. Possibly the man refused
from fear of blood-revenge, which would be the more certainly
exacted in the case of the Lord’s Anointed, compare 1 Samuel ii. 22,
xxvi. 9 (so Curtis).

and abuse me] i.e. wreak their cruel will upon me; compare
Judges i. 6.

⁶So Saul died, and his three sons; and all his
house died together.
6. all his house] In Samuel “his armourbearer and all his men.”
The reference may be to Saul’s servants: his family was not
exterminated in this battle.

⁷And when all the men of Israel that were in
the valley saw that they fled, and that Saul
and his sons were dead, they forsook their
cities, and fled; and the Philistines came and
dwelt in them.
⁸And it came to pass on the morrow, when
the Philistines came to strip the slain, that they
found Saul and his sons fallen in mount
Gilboa.
7. that were in the valley] The “valley of Jezreel” (Hosea i. 5),
called in later times the “plain of Esdrelon” (Esdraelon), is meant.

forsook their cities] Among these was no doubt Beth-shan
(Beisan) “the key of Western Palestine” (see G. A. Smith, Historical
Geography of the Holy Land pp. 358 f.), where Saul’s body was
exposed (1 Samuel xxxi. 12).

and dwelt in them] Perhaps for a short while only, compare 2
Samuel ii. 9, “[Abner] made him (Ish-bosheth) king over ... Jezreel.”
Ish-bosheth, however, may have “ruled” only in acknowledgment of a
Philistine suzerainty.

⁹And they stripped him, and took his head,
and his armour, and sent into the land of the
Philistines round about, to carry the tidings
unto their idols, and to the people.
9. to carry the tidings unto their idols] Better, as in Samuel, “to
publish it in the house (or houses) of their idols”; compare the next
verse. The news was published by the exhibition of trophies of the
victory in the Philistine temples.

¹⁰And they put his armour in the house of their
gods, and fastened his head in the house of
Dagon.
10. in the house of their gods] In Samuel (more definitely) “in the
house (or houses) of Ashtaroth,” Ashtaroth being the plural of
Ashtoreth, a goddess, who seems here to bear a martial character.
(The name Ashtoreth is an artificial formation, the proper form being
Ashtarte. The vowels of the word bōshĕth, i.e. shame, were used for
the last two syllables in place of the true vowels; compare note on
viii. 33.) She was apparently consort of the Phoenician Baal (Judges
ii. 13, x. 6).

fastened his head in the house of Dagon (literally Beth-Dagon)] In
Samuel fastened his body to the wall of Beth-shan. The reading of
Chronicles is probably an arbitrary alteration made by the Chronicler
out of regard for 1 Samuel xxxi. 9, where it is related that the
Philistines cut off Saul’s head. It is just possible that the variation
points to a fuller original text containing all three statements—that
Saul’s armour was placed in the temple of Ashtarte, his head in the
“house of Dagon,” and his headless corpse fastened to the walls of
Beth-shan. Beth-shan is north-east of Gilboa, about four miles
distant from the Jordan, and about a day’s march (1 Samuel xxxi.
12) from Jabesh (verse 11), which was situated on the other side of
Jordan in Gilead.

¹¹And when all Jabesh-gilead heard all that the
Philistines had done to Saul,
11. Jabesh-gilead] See 1 Samuel xi. 1‒11; 2 Samuel ii. 4‒7.

¹²all the valiant men arose, and took away the
body of Saul, and the bodies of his sons, and
brought them to Jabesh, and buried their
bones under the oak ¹ in Jabesh, and fasted
seven days.
¹ Or, terebinth.

12. took away] i.e. from the walls of Beth-shan (so Peshitṭa).

to Jabesh] Samuel adds “and burned them there.” The Chronicler
omits this statement perhaps because he inferred that the bones
were not destroyed by this burning; compare 2 Samuel xxi. 12‒14
(the bones of Saul and Jonathan brought from Jabesh in David’s
reign and re-interred in the family sepulchre) or more probably
because burning was not a usual funeral rite among the Jews
(compare 2 Chronicles xvi. 14, note), and indeed was regarded with
abhorrence (compare Amos ii. 1).

under the oak] margin, terebinth. Large trees, being rare in
Palestine, frequently serve as landmarks; compare Judges iv. 5; 1
Samuel xxii. 6 (“tamarisk tree” Revised Version).

fasted seven days] Fasting involved abstinence from food during
daylight. David fasted “till the evening” in mourning for Saul (2
Samuel i. 12) and for Abner (2 Samuel iii. 35). The fast of Jabesh
was a sevenfold fast.

13, 14 (peculiar to Chronicles).
The Moral of the Overthrow of the House of Saul.

Such reflexions as these are characteristic of the Chronicler;
compare 2 Chronicles xii. 2 (note), xxii. 7, xxiv. 24, xxv. 27. They are
not so frequent in Samuel and Kings.

¹³So Saul died for ¹ his trespass which he
committed against the Lord, because of the
word of the Lord, which he kept not; and also
for that he asked counsel of one that had a
familiar spirit, to inquire thereby,
¹ Or, in.

13. his trespass] compare 2 Chronicles xxvi. 16. The reference is
to Saul’s sacrifice (1 Samuel xiii. 13, 14), and disobedience (1
Samuel xv. 23).

asked counsel ... spirit] i.e. of the witch of Endor, 1 Samuel xxviii.
7 ff.

¹⁴and inquired not of the Lord: therefore he
slew him, and turned the kingdom unto David
the son of Jesse.
14. and inquired not of the Lord] Compare xiii. 3. The Chronicler
presumably does not count inquiries made too late; compare 1
Samuel xxviii. 6 (Saul inquires of the Lord, but receives no answer).
Chapter XI.
1‒3 (= 2 Samuel v. 1‒3).
David made King over all Israel.

The remaining chapters of the first book of Chronicles are
occupied with the reign of David, who is represented as a king
fulfilling the Chronicler’s highest ideals of piety and prosperity. For
some general remarks on the difference between the picture thus
given and the David of Samuel see the note on xxviii. 1.

¹Then all Israel gathered themselves to
David unto Hebron, saying, Behold, we are thy
bone and thy flesh.
1. Then] Render And.

all Israel] Chronicles has nothing here corresponding to 2 Samuel
i.‒iv., chapters which cover a period of seven years (2 Samuel v. 5).
David’s earlier coronation by the men of Judah (2 Samuel ii. 4), the
reign of Ish-bosheth over Northern and Eastern Israel (2 Samuel ii. 8
ff.), and the “long war” (2 Samuel iii. 1) with the house of Saul are
omitted not of course because the Chronicler was ignorant of these
events (for see the references in verses 15, 17; xii. 1, 23, 29, etc.),
but for the reason set forth above in the head-note to chapters x.‒
xxix. The Chronicler’s account is perhaps deliberately adapted to
convey an impression of the ease with which the ideal David
ascends the throne of a united Israel; and, if we had not the narrative
in Samuel to help us, we should be left with a conception of the
period very different from the actual course of events. How strange,
too, would be the sudden transition from the picture of defeat and
flight of Israel in chapter x. to the calm assemblage of all Israel in
chapter xi., and how obscure the various references to David’s
earlier life in xi. 15 ff.!

we are thy bone and thy flesh] The phrase is not to be taken
strictly as implying kinship, for only the tribe of Judah could say “The
king is near of kin to us” (2 Samuel xix. 42). The other tribes mean
that they will obey David as though he were their own kin.

²In times past, even when Saul was king, it
was thou that leddest out and broughtest in
Israel: and the Lord thy God said unto thee,
Thou shalt feed my people Israel, and thou
shalt be prince ¹ over my people Israel.
¹ Or, leader.

2. the Lord thy God said] Compare verses 3, 10; 1 Samuel xvi.
1‒13.

prince] compare v. 2, note.

³So all the elders of Israel came to the king to
Hebron; and David made a covenant with
them in Hebron before the Lord; and they
anointed David king over Israel, according to
the word of the Lord by the hand of Samuel.
3. made a covenant] i.e. gave them a charter in which he
promised to respect existing rights; compare 1 Samuel x. 25
(Samuel writes the “manner” of the kingdom).

before the Lord] One method of entering into a covenant “before
the Lord” was to pass between the parts of a sacrificed animal;
compare Jeremiah xxxiv. 18, 19.
Chronicles has nothing here corresponding with 2 Samuel v. 4, 5;
but compare xxix. 27.

according ... Samuel] Compare 1 Samuel xv. 28, xvi. 1, 3.

4‒9 (= 2 Samuel v. 6‒10).
The “City of David” captured and made a royal residence.

⁴And David and all Israel went to Jerusalem
(the same is Jebus); and the Jebusites, the
inhabitants of the land, were there.
4. David and all Israel] In Samuel (more accurately) “The king
and his men,” i.e. his household and body-guard; compare x. 6, note.
The exploit recorded in Samuel is invested by the Chronicler with the
grandeur of a state campaign.

the same is Jebus] Jerusalem (or Jebus) consisted, it seems
(compare verse 8; Judges i. 21), of a citadel inhabited by Jebusites
and of a lower city inhabited by a mixed population of Jebusites and
Benjamites. It was the citadel only which David stormed. Jerusalem
is called Jebus only here and in Judges xix. 10 f. The notion that
Jebus was an ancient name for the city may be only a late fancy, but
it is possible that it was sometimes so called in the days of the
Jebusite rule. What is certain is that the name Jerusalem is ancient,
for it occurs frequently (as Urusalim) in the Amarna tablets, circa
1400 b.c. See G. A. Smith, Jerusalem.

the Jebusites, the inhabitants of the land] i.e. masters of that
territory before the Israelite invasion. The Jebusites have been
thought to be of Hittite origin, but they were probably Semites, like
the Israelites (see G. A. Smith, Jerusalem, ii. 16‒18).

⁵And the inhabitants of Jebus said to David,
Thou shalt not come in hither. Nevertheless
