Started in January 1974 (Monthly)
Supervised and Sponsored by Chongqing Southwest Information Co., Ltd.
ISSN 1002-137X
CN 50-1075/TP
CODEN JKIEBK
Current Issue
Volume 43 Issue Z6, 14 November 2018
Research and Development on Structured Sparse Models and Algorithms
LIU Jian-wei, CUI Li-peng and LUO Xiong-lin
Computer Science. 2016, 43 (Z6): 1-16.  doi:10.11896/j.issn.1002-137X.2016.6A.001
Group sparse models have many important applications in statistics, signal processing and machine learning. A group sparse model achieves feature-group selection by introducing a sparsity-inducing penalty function into the objective function; interestingly, some group sparse models can perform group selection and within-group feature selection simultaneously. According to the penalty function, group sparse models fall into two main categories: Group Lasso models and group sparse models with non-convex penalties. This paper systematically surveyed the important group sparse models and analyzed the differences and relations among them. In addition, we summarized and compared their statistical properties (such as model selection consistency, parameter estimation consistency and the oracle property) and their solving algorithms. Roughly speaking, the Group Lasso models include the standard Group Lasso model, the L∞,1-penalty Group Lasso model, the overlapping Group Lasso model, the tree-guided Group Lasso model, the multiple-output tree-guided Group Lasso model, the mixed Group Lasso model, the adaptive Group Lasso model, the logistic Group Lasso model and the Bayesian Group Lasso model. Algorithms for solving group sparse models include Group LARS, the block coordinate descent (ascent) method, the active set method, the interior point method, the projected gradient method, the spectral projected gradient method, the alternating direction method of multipliers and the block coordinate gradient descent method. We analyzed these algorithms in detail for specific group sparse models. Before applying the optimization methods above, the objective function must be pretreated: the nonsmooth, nonconvex and non-separable penalty function in the objective of a group sparse model must be transformed into a smooth, convex and separable function. Variational inequalities, Nesterov's smoothing technique, local first-order approximation by Taylor expansion, local quadratic approximation, the dual norm and the dual function are often used in this step. Next, some recently proposed group sparse models were introduced, such as the Group Lasso model based on generalized additive models, the composite group bridge model, the group square-root Lasso model and the Group Lasso model based on the Tobit model. Finally, we discussed future research directions for group sparse models.
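The group-level selection described above comes from the proximal operator of the Group Lasso penalty, which is block soft-thresholding: each group of coefficients is shrunk jointly and is either kept as a whole or zeroed out as a whole. A minimal illustrative sketch (not code from the paper; the coefficients, grouping and regularization strength are made up):

```python
import math

def group_soft_threshold(beta, groups, lam):
    """Proximal operator of the group lasso penalty lam * sum_g ||beta_g||_2:
    each group's coefficient block is scaled toward zero as a unit, so a
    group is either selected entirely or eliminated entirely."""
    out = list(beta)
    for g in groups:
        norm = math.sqrt(sum(beta[i] ** 2 for i in g))
        # shrink the whole block; blocks with norm <= lam are zeroed out
        scale = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
        for i in g:
            out[i] = scale * beta[i]
    return out

# Two groups: the first has small norm and is removed entirely (group
# selection); the second survives with proportionally shrunken coefficients.
beta = [0.3, 0.4, 3.0, 4.0]
groups = [[0, 1], [2, 3]]
result = group_soft_threshold(beta, groups, lam=1.0)
```

Inside a proximal-gradient solver, this operator is applied after each gradient step on the smooth loss, which is exactly why the penalty must first be made separable across groups.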
New Development of Artificial Cognitive Computation: TrueNorth Neuron Chip
WANG Yu-chen and HU Hua
Computer Science. 2016, 43 (Z6): 17-20.  doi:10.11896/j.issn.1002-137X.2016.6A.002
Inspired by the brain's operating mechanism, artificial cognitive memory is a novel neural-network-based approach with self-learning and self-adaptive ability. In principle, artificial cognitive memory can break the Von Neumann bottleneck, significantly accelerating information processing and reducing power consumption. In this paper, the TrueNorth neuron chip developed by IBM Research was introduced in detail, including its fundamental architecture, computing principle, chip characteristics and application results. Finally, the future development prospects of artificial cognitive memory were discussed.
Distributive Law in Deduction Mechanism of Logic
SHI Hang, WANG Bao-shan and WU Mei-hua
Computer Science. 2016, 43 (Z6): 21-24.  doi:10.11896/j.issn.1002-137X.2016.6A.003
It is well known that the distributive law plays a core role in the deduction mechanism of classical logic. However, the distributive law is abandoned in quantum logic, so the classical deduction mechanism disappears there, which naturally raises the debate over whether quantum logic can be called "logic" at all. In this paper, we introduced the defects of using closed subspaces of a Hilbert space to describe quantum logic and analyzed the deduction mechanism of classical logic in depth. Further, the deduction mechanism can be established in quantum logic by using the orthomodular law instead of the distributive law. In particular, the deduction mechanism can be recast with adjunctions in category theory, which generalizes the deduction mechanism of classical logic.
Fault Detection for Beer Fermentation Process Based on Segmentation Multiway Kernel Principal Component Analysis
LV Ning, YAN Lu-qi and BAI Guang-yuan
Computer Science. 2016, 43 (Z6): 25-27.  doi:10.11896/j.issn.1002-137X.2016.6A.004
The fault diagnosis model based on principal component analysis has limitations in nonlinear time-varying processes. Based on the characteristics of batch processes, we introduced kernel transformation into the extraction of nonlinear data and proposed an improved fault diagnosis model based on multiway kernel principal component analysis. This method handles the nonlinearity of process data well and fully extracts nonlinear information, since the nonlinear principal components can be rapidly extracted in a high-dimensional feature space. The method was tested by comparison, and the results show that it has good accuracy and real-time performance for slowly time-varying batch processes.
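The kernel-PCA step this approach relies on can be sketched as: build a kernel matrix over the samples, double-center it (so the implicitly mapped data have zero mean in feature space), and extract the leading nonlinear principal component. The sketch below is a toy illustration only, with an assumed RBF kernel, invented two-cluster data, and power iteration standing in for a full eigendecomposition; it is not the paper's monitoring model:

```python
import math

def rbf_kernel(X, gamma):
    n = len(X)
    return [[math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(X[i], X[j])))
             for j in range(n)] for i in range(n)]

def center_kernel(K):
    """Double-center K: Kc = K - 1K - K1 + 1K1, the standard kernel-PCA step."""
    n = len(K)
    row = [sum(r) / n for r in K]
    tot = sum(row) / n
    return [[K[i][j] - row[i] - row[j] + tot for j in range(n)] for i in range(n)]

def top_component(Kc, iters=300):
    """Leading eigenpair of the centered kernel matrix via power iteration;
    the eigenvector holds each sample's score on the first nonlinear
    principal component."""
    n = len(Kc)
    # must not start from the all-ones vector: centering maps it to zero
    v = [1.0] + [0.0] * (n - 1)
    lam = 0.0
    for _ in range(iters):
        w = [sum(Kc[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = math.sqrt(sum(x * x for x in w))
        v = [x / lam for x in w]
    return lam, v

# Two well-separated clusters; the first component separates them by sign.
X = [[0.0, 0.1], [0.2, 0.0], [0.1, 0.2], [1.0, 1.1], [1.2, 1.0]]
Kc = center_kernel(rbf_kernel(X, gamma=0.5))
lam, scores = top_component(Kc)
```

In a monitoring setting, statistics such as T² would then be computed from these scores and compared against control limits learned from normal batches.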
Research of Emergent Transportation Traffic Management Optimized Model Based on CDM Mode
LV Rong-sheng, FANG Jing and LIU Shan-shan
Computer Science. 2016, 43 (Z6): 28-33.  doi:10.11896/j.issn.1002-137X.2016.6A.005
Given the existing airdrome facilities, how to improve aid efficiency through air traffic management is an urgent problem. This paper put forward an air traffic management optimization model based on the CDM mode, which schedules and optimizes the air traffic into and out of the airdrome, increasing its capacity through proper diversion. The paper also demonstrated the model's feasibility by a simulation algorithm.
Implementation Method and System for EV Self-diagnosis
ZHANG Jun-hui, LI Qing and CHEN Da-peng
Computer Science. 2016, 43 (Z6): 34-36.  doi:10.11896/j.issn.1002-137X.2016.6A.006
Future vehicles will be highly intelligent, informative and green. Traditional OBD (On-Board Diagnostics) solutions require analysis by engineers themselves, which wastes considerable manpower and time. Our solution, by means of Internet-of-Vehicles technology, databases and data mining, realizes intelligent self-diagnosis over the vehicle CAN (Controller Area Network). It also supports statistics and classification, which lets engineers easily evaluate a node device's reliability, stability and anti-interference ability, as well as its most suitable working environment. A product based on this solution has already been put into practical service.
Solitary Pulmonary Nodules Classification Based on Genetic Algorithm and Back Propagation Neural Networks
HU Qiang, HAO Xiao-yan and LEI Lei
Computer Science. 2016, 43 (Z6): 37-39.  doi:10.11896/j.issn.1002-137X.2016.6A.007
In order to improve the accuracy of benign/malignant diagnosis of solitary pulmonary nodules in a computer-aided diagnosis system, this paper proposed a novel classification algorithm based on a genetic algorithm and back-propagation neural networks. Considering the local-optimum problem of BP neural networks and the medical diagnosis features of solitary pulmonary nodules, the proposed algorithm uses a genetic algorithm to optimize the BP-network-based classifier. Through PET/CT image processing, the functional characteristics, structural characteristics and clinical information of the lesions are extracted as input samples of the neural network classifier. Then, the benign/malignant diagnosis of the solitary pulmonary nodules is made by the new classifier. Classification experiments on a large amount of data from a hospital and from public online databases show that the optimized algorithm greatly improves classification accuracy, indicating that the method is effective for clinical classification of pulmonary nodules.
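As an illustration of the GA-optimization idea (not the paper's implementation), the sketch below runs a minimal real-coded genetic algorithm against a toy quadratic error surface standing in for the BP network's training error; population size, generation count and mutation scale are all invented for the example:

```python
import random

def ga_minimize(fitness, dim, pop_size=30, gens=60, bounds=(-1.0, 1.0), seed=0):
    """Minimal real-coded GA: tournament selection, uniform crossover,
    Gaussian mutation, and elitism. Returns the best vector found."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(gens):
        new = [best[:]]                                  # elitism: keep the best
        while len(new) < pop_size:
            a = min(rng.sample(pop, 3), key=fitness)     # tournament selection
            b = min(rng.sample(pop, 3), key=fitness)
            child = [x if rng.random() < 0.5 else y for x, y in zip(a, b)]
            if rng.random() < 0.3:                       # mutate one gene
                i = rng.randrange(dim)
                child[i] += rng.gauss(0, 0.1)
            new.append(child)
        pop = new
        best = min(pop, key=fitness)
    return best

# Stand-in for the network's training error: a quadratic bowl with optimum
# at weight vector (0.5, 0.5, 0.5, 0.5).
error = lambda w: sum((x - 0.5) ** 2 for x in w)
w = ga_minimize(error, dim=4)
```

In the hybrid scheme described above, `error` would instead evaluate the BP network's classification error for a candidate set of initial weights, and BP training would refine the GA's result.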
Attribute Reduction Based on Cost Minimization and Significance of Joint Attributes
XU Fei-fei, BI Zhong-qin and LEI Jing-sheng
Computer Science. 2016, 43 (Z6): 40-43.  doi:10.11896/j.issn.1002-137X.2016.6A.008
Classical rough set attribute reduction is mainly based on keeping the positive, boundary and negative regions unchanged. In the decision-theoretic rough set model, the reduction procedure is no longer monotonic when adding or deleting an attribute, so the three regions cannot all remain unchanged. In the decision-theoretic rough set model, decision making should also take cost minimization into account. Therefore, this paper put forward an attribute reduction method based on cost minimization that also considers the classification ability of the selected attribute subset with respect to decision making, named the significance of joint attributes. Experiments show that our method is effective.
Intuitionistic Extension of Mamdani Fuzzy Reasoning Arithmetic
WANG Jian, SHI Zhao-hui, GUO Xin-peng and LI Wei-ping
Computer Science. 2016, 43 (Z6): 44-45.  doi:10.11896/j.issn.1002-137X.2016.6A.009
The Mamdani fuzzy reasoning arithmetic was intuitionistically extended in this paper. Firstly, the fuzzy relation (FR) Rc defined by Mamdani was extended to the intuitionistic case. Secondly, the intuitionistic fuzzy generalized modus ponens and generalized modus tollens formulas of the IFR Rc were deduced. Finally, an instance was given to illustrate the logical reasoning and computation in detail and to validate the method, and its performance was evaluated with intuitionistic rules.
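For orientation, the classical (non-intuitionistic) Mamdani scheme being extended here uses min for rule firing, max for aggregation, and centroid defuzzification. A minimal sketch with two invented triangular rules, not the paper's intuitionistic formulas:

```python
def tri(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def mamdani(x, rules, universe):
    """Mamdani inference: each rule clips its consequent set at the firing
    strength (min), outputs are aggregated by max, then the aggregated set
    is defuzzified by the sampled centroid."""
    def aggregated(y):
        return max(min(ante(x), cons(y)) for ante, cons in rules)
    num = sum(y * aggregated(y) for y in universe)
    den = sum(aggregated(y) for y in universe)
    return num / den if den else 0.0

# Toy rule base: "if input is LOW then output is SMALL",
#                "if input is HIGH then output is LARGE".
rules = [
    (lambda x: tri(x, 0.0, 0.25, 0.5), lambda y: tri(y, 0, 2, 4)),
    (lambda x: tri(x, 0.5, 0.75, 1.0), lambda y: tri(y, 6, 8, 10)),
]
universe = [i * 0.1 for i in range(101)]   # sampled output domain [0, 10]
low_out = mamdani(0.25, rules, universe)   # only the LOW rule fires
high_out = mamdani(0.75, rules, universe)  # only the HIGH rule fires
```

The intuitionistic extension replaces each membership degree with a membership/non-membership pair, but the min-max compositional structure sketched here carries over.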
Research on Image Classification Algorithm Based on Semi-supervised Deep Belief Network
ZHU Chang-bao, CHENG Yong and GAO Qiang
Computer Science. 2016, 43 (Z6): 46-50.  doi:10.11896/j.issn.1002-137X.2016.6A.010
In recent years, deep learning has been applied successfully to images, voice, video and other unstructured data, and has become a hot topic in machine learning and data mining. As supervised learning models, successful deep learning applications often require a large, high-quality training set. In view of this, we studied deep belief networks composed of stacked restricted Boltzmann machines and, combining the idea of semi-supervised learning, used a smaller training set to improve the classification accuracy of the deep network model. We used three methods (kNN, SVM and pHash) to exploit the unlabeled data set. The results show that the semi-supervised deep belief network increases image classification accuracy by about 3% compared with the traditional stacked-RBM network.
Sequential Verification Algorithm to Compute Edit Distance Based on Edit Operation Sequence
ZHANG Run-liang and NIU Zhi-xian
Computer Science. 2016, 43 (Z6): 51-54.  doi:10.11896/j.issn.1002-137X.2016.6A.011
The edit distance between two strings is the minimum number of edit operations required to transform one into the other. It is widely used in approximate string matching, string similarity joins, etc. The dynamic programming algorithm (DPA) computes the edit distance via an edit distance matrix; it must compute all the elements of the matrix and thus has poor time efficiency. The progressive method changes the computation order of the elements to reduce their number, but it still needs to compute half of the elements, so its time efficiency can also be improved. In this paper, we proposed a sequential verification algorithm that computes the edit distance based on edit operation sequences. First, we analyzed the enumerable nature of edit operation sequences and gave a way to enumerate them. Then, we obtained the result by verifying the edit operation sequences sequentially, ordered by their number of edit operations, until a verification succeeds. Experiments on approximate string search with threshold 2 show that, compared with the DPA, our method achieves high performance.
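The dynamic-programming baseline that the proposed sequential-verification algorithm is compared against can be sketched as follows (a standard two-row implementation of the DPA, not the paper's new algorithm):

```python
def edit_distance(s, t):
    """Classic DP baseline: conceptually fills an (|s|+1) x (|t|+1) matrix
    where cell (i, j) is the edit distance between prefixes s[:i] and t[:j];
    only two rows are kept in memory at a time."""
    m, n = len(s), len(t)
    prev = list(range(n + 1))          # distances from the empty prefix of s
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            cur[j] = min(prev[j] + 1,         # deletion from s
                         cur[j - 1] + 1,      # insertion into s
                         prev[j - 1] + cost)  # substitution or match
        prev = cur
    return prev[n]
```

Every cell is computed regardless of the answer, which is exactly the inefficiency the sequential-verification approach avoids for small thresholds: it tries the cheapest candidate operation sequences first and stops at the first one that works.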
Modeling and Analysis of Avionics Configurations Control System Based on AADL
ZHOU De-xin, LI Ning and LIU Zhe-xu
Computer Science. 2016, 43 (Z6): 55-59.  doi:10.11896/j.issn.1002-137X.2016.6A.012
To verify an avionics system's different configuration simulations and the system's re-modification, and to ensure that the configuration control system can change according to operational needs so that the system structure can be reliably reconstructed, the architecture analysis and design language (AADL) was adopted to model the avionics configuration control system. With AADL, the system was analyzed in detail, its features and key techniques were revealed, and its operational principle and internal structure were reflected. The model captures the system's functions and non-functional constraints, realizes a system-level modeling strategy and describes the system structure hierarchically.
Large Deviation Algorithm of Huge Ship Fluid Solid Coupling Deformation in Random Sea Wave Condition
CHEN Zhao, ZHANG She-sheng and LI Yu-guang
Computer Science. 2016, 43 (Z6): 60-63.  doi:10.11896/j.issn.1002-137X.2016.6A.013
It is of practical significance to study the long-term deformation of large carrier-class ships. Under random sea-state conditions, according to the fluid-solid coupling interaction between the waves and the hull, a mathematical model for the large deviation algorithm of large-ship deformation was built. The large deviation numerical principles of ship deformation and absolute deformation were given, and the large deviation optimization algorithm for ship deformation was shown with a computed example. The theory and results are useful for research on ship fatigue damage.
SIRS Epidemic Model and its Threshold Based on State Transition Probability
GU Hai-jun, JIANG Guo-ping and XIA Ling-ling
Computer Science. 2016, 43 (Z6): 64-67.  doi:10.11896/j.issn.1002-137X.2016.6A.014
For the SIRS (Susceptible-Infected-Removed-Susceptible) epidemic model, we used the state transition probability method to study the epidemic process, calculating the probability of each state over time. First, we established state probability equations describing the probability that each node is in the susceptible, infected or immune state at each moment. Then, we derived the epidemic threshold of the SIRS model by steady-state analysis. Finally, using the Monte Carlo method, we analyzed and simulated the epidemic threshold in both homogeneous and heterogeneous networks. Compared with the traditional mean-field method, the simulation results show that the threshold obtained from the state probability equations is much closer to the real Monte Carlo value and is independent of the rate of immunity loss.
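A minimal Monte Carlo SIRS simulation of the kind used for validation might look like this. It assumes a fully mixed population with invented rates (the paper works on networks, which this sketch does not model); above the threshold the infection becomes endemic, below it the infection dies out:

```python
import random

def sirs_monte_carlo(n, beta, gamma, delta, steps, seed=1):
    """Monte Carlo SIRS in a fully mixed population: S->I with probability
    beta per infected contact, I->R with probability gamma, R->S with
    probability delta (loss of immunity). Returns the final infected fraction."""
    rng = random.Random(seed)
    state = ['I' if i < n // 20 else 'S' for i in range(n)]  # 5% initially infected
    for _ in range(steps):
        infected = state.count('I')
        # probability a susceptible meets at least one infecting contact
        p_inf = 1.0 - (1.0 - beta) ** infected
        nxt = []
        for s in state:
            if s == 'S':
                nxt.append('I' if rng.random() < p_inf else 'S')
            elif s == 'I':
                nxt.append('R' if rng.random() < gamma else 'I')
            else:
                nxt.append('S' if rng.random() < delta else 'R')
        state = nxt
    return state.count('I') / n

# Transmission well above threshold: endemic steady state.
high = sirs_monte_carlo(500, beta=0.002, gamma=0.2, delta=0.1, steps=200)
# Transmission well below threshold: the epidemic dies out.
low = sirs_monte_carlo(500, beta=0.0001, gamma=0.5, delta=0.1, steps=200)
```

Sweeping `beta` and locating where the endemic fraction drops to zero is the Monte Carlo estimate of the threshold that the state-probability-equation prediction is checked against.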
Map Matching Algorithm Based on Conditional Random Fields and Low-sampling-rate Floating Car Data
YANG Xu-hua and PENG Peng
Computer Science. 2016, 43 (Z6): 68-72.  doi:10.11896/j.issn.1002-137X.2016.6A.015
In this paper, a new map matching algorithm (FB-MM) based on conditional random fields and low-sampling-rate floating car data was proposed. On the basis of the road network model, the candidate projection points of each GPS observation and their observation probabilities are obtained, together with the candidate paths and the transfer probabilities between adjacent candidate projection points. Then the probability weight of every candidate projection point is computed with a forward-backward recursion based on conditional random fields within a sliding window, and the best matching projection point is selected according to this weight. Based on low-sampling-rate floating car data, the algorithm makes full use of the topological information of the road network and the correlation between GPS observations, and therefore achieves a better map matching effect.
Hamilton Decomposition of BSCC(4,k)
HU Yan-hong and SHI Hai-zhong
Computer Science. 2016, 43 (Z6): 73-76.  doi:10.11896/j.issn.1002-137X.2016.6A.016
The bubble-sort connected cycle is an important class of interconnection network. Shi Hai-zhong conjectured that BSCC(n) (n≥4) is an edge-disjoint union of a Hamiltonian cycle and a perfect matching. Denoting BSCC(n) as BSCC(n,0), we studied the new network BSCC(4,1), obtained by replacing each vertex of BSCC(4,0) with a triangle; BSCC(4,2) is obtained by replacing each vertex of BSCC(4,1) with a triangle, and, similarly, the network obtained after k such replacements is BSCC(4,k). Shi Hai-zhong further raised conjecture 2: BSCC(n,k) is an edge-disjoint union of a Hamiltonian cycle and a perfect matching. In this paper, we proved that BSCC(4,k) is an edge-disjoint union of a Hamiltonian cycle and a perfect matching.
Wavelet Kernel Extreme Learning Machine Algorithm Based on Detecting Particle Swarm Optimization
CHEN Xiao-qing, LU Hui-juan, GUAN Wei and ZHENG Wen-bin
Computer Science. 2016, 43 (Z6): 77-80.  doi:10.11896/j.issn.1002-137X.2016.6A.017
In this paper, the principle of the kernel extreme learning machine was studied, and a wavelet function was chosen as the extreme learning machine's kernel. Experiments show that this algorithm improves classification accuracy and robustness. On this basis, we used detecting particle swarm optimization (DPSO) to optimize the initial parameters of WKELM, obtaining the optimal WKELM classifier DPSO-WKELM. Simulations were carried out on UCI gene data, and the classification results were compared with those of the radial basis kernel extreme learning machine (KELM) and WKELM. The comparison shows that the proposed algorithm achieves higher classification accuracy.
Attribute Reduction Algorithm for Incomplete Information Systems Based on Approximate Fuzzy Entropy
WANG Qiong-zhi, ZHENG Wen-xi and WANG Dao-ran
Computer Science. 2016, 43 (Z6): 81-82.  doi:10.11896/j.issn.1002-137X.2016.6A.018
Attribute reduction is an important research topic in rough set theory, and attribute reduction based on information entropy is an effective knowledge reduction method. In practical applications, the acquired information system is usually incomplete. To address this, we defined a new knowledge entropy based on the relationship between the attribute subsets redu and CAttr-redu, and proposed a new attribute reduction algorithm for incomplete information systems (the newS algorithm) applying approximate fuzzy entropy. Finally, simulation experiments were carried out on 6 data sets from ROSE and UCI. The experimental results show that the newS algorithm is feasible and more efficient than other algorithms with the same reduction effect.
Conservative Extensions in Description Logic εVL
NIE Deng-guo, YU Quan, ZHANG Wei and SHEN Yu-ming
Computer Science. 2016, 43 (Z6): 83-86.  doi:10.11896/j.issn.1002-137X.2016.6A.019
An ontology is, in effect, a structured knowledge base in description logic. Knowledge does not stay the same, so an ontology needs to be extended whenever new developments appear in its field, and the concern is whether the extension remains consistent with the primitive ontology. The conservative extension of the εVL system was analyzed based on Lutz's work. Firstly, the εVL canonical model was constructed and the inclusion inference was reduced to simulations between two εVL canonical models. The complexity was shown to be polynomial, since the canonical models' largest simulation can be computed in polynomial time. After that, the εVL conservative extension algorithm was presented and its complexity was proved to be exponential.
Research on Bus Arrival Time Prediction Model Based on Fuzzy Neural Network with Genetic Algorithm
LUO Pin-jie, WEN He and WAN Li
Computer Science. 2016, 43 (Z6): 87-89.  doi:10.11896/j.issn.1002-137X.2016.6A.020
Because bus arrival time prediction is influenced by many factors, and the effect of each factor on prediction accuracy cannot be measured, the problem is difficult to solve with traditional mathematical models. In this paper, a fuzzy neural network model based on a genetic algorithm was used to predict bus arrival times. In this model, a genetic algorithm and a fuzzy inference system are integrated into a multi-layer feed-forward neural network; the initial parameters of the network are initialized and updated by the membership degrees of the fuzzy rules, and a multi-population adaptive genetic algorithm for macro search improves the network's optimization ability. Running time prediction for a bus line in Chengdu was used as a simulation example. The simulation results show that the genetic-algorithm-based fuzzy neural network prediction model for bus arrival times has higher accuracy and reliability.
Improved Genetic Algorithm for Traveling Salesman Problem
WEN Yi and PAN Da-zhi
Computer Science. 2016, 43 (Z6): 90-92.  doi:10.11896/j.issn.1002-137X.2016.6A.021
The traveling salesman problem (TSP) is a typical combinatorial optimization problem and an NP-hard problem. Since an exact solution is hard to obtain, searching for near-optimal solutions is very important. To address the slow convergence and premature convergence of the basic genetic algorithm on TSP, this paper put forward an improved crossover operator and a population updating strategy based on similarity. The improved crossover operator exchanges two cities' positions according to the distance between them, accelerating convergence, and the similarity-based population updating strategy effectively prevents the population from premature convergence in the later stages of the algorithm. In particular, for CHN144, the best path found is better than that of the basic algorithm.
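For orientation, a baseline GA for TSP (standard order crossover and swap mutation, not the paper's improved distance-based crossover or similarity-based updating; all parameters invented) can be sketched as:

```python
import math, random

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def order_crossover(p1, p2, rng):
    """Order crossover (OX): copy a slice from one parent, fill the remaining
    positions with the other parent's cities in their original order."""
    n = len(p1)
    i, j = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[i:j] = p1[i:j]
    rest = [c for c in p2 if c not in child]
    for k in range(n):
        if child[k] is None:
            child[k] = rest.pop(0)
    return child

def ga_tsp(pts, pop_size=40, gens=150, seed=3):
    rng = random.Random(seed)
    n = len(pts)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda t: tour_length(t, pts))
        new = pop[:5]                              # elitism: keep the best tours
        while len(new) < pop_size:
            child = order_crossover(*rng.sample(pop[:20], 2), rng)
            if rng.random() < 0.2:                 # swap mutation
                a, b = rng.sample(range(n), 2)
                child[a], child[b] = child[b], child[a]
            new.append(child)
        pop = new
    return min(pop, key=lambda t: tour_length(t, pts))

# Cities on a circle: the optimal tour visits them in angular order.
pts = [(math.cos(2 * math.pi * k / 8), math.sin(2 * math.pi * k / 8))
       for k in range(8)]
best = ga_tsp(pts)
```

The improvements described above plug into this skeleton at two points: the crossover (made distance-aware) and the population replacement step (made similarity-aware to preserve diversity late in the run).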
New Method of Determining Evidences Weight Coefficient
XU Jiang-jun
Computer Science. 2016, 43 (Z6): 93-94.  doi:10.11896/j.issn.1002-137X.2016.6A.022
We analyzed the defects of existing ways to determine the weight of evidence and gave a new modified method. It calculates each piece of evidence's primary weight via the similarity between that evidence and the average evidence. When a weight is below the threshold, the evidence is removed from the evidence group and the weights of the remaining evidence are recalculated; after K iterations, once all weights exceed the threshold, the arithmetic mean of the K weights is taken. Finally, the D-S combination rule is used to combine the weighted average evidence. An example indicates that the new method gives more reasonable combination results than the D-S combination rule and other modified methods.
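The final combination step can be illustrated with Dempster's rule applied to a weighted average of mass functions. The sketch below uses invented masses and weights, and omits the iterative threshold-based reweighting itself:

```python
def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions with frozenset focal elements:
    multiply masses, intersect focal elements, and renormalize by 1 - K,
    where K is the conflicting (empty-intersection) mass."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}

def weighted_average_evidence(masses, weights):
    """Weighted average of mass functions, the pre-combination step; here the
    weights are assumed to come from evidence-similarity as described above."""
    total = sum(weights)
    avg = {}
    for m, w in zip(masses, weights):
        for k, v in m.items():
            avg[k] = avg.get(k, 0.0) + w * v / total
    return avg

A, B = frozenset('A'), frozenset('B')
m1 = {A: 0.9, B: 0.1}
m2 = {A: 0.8, B: 0.2}
avg = weighted_average_evidence([m1, m2], [0.6, 0.4])
fused = dempster_combine(avg, avg)   # combine the averaged evidence with itself
```

Averaging before combining is what protects the result from a single conflicting piece of evidence dominating the fusion, which is the classic failure mode of applying Dempster's rule directly.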
Optimization Method of Support Domain Radius of Moving Least Squares Agent Model
LENG Ya-hong
Computer Science. 2016, 43 (Z6): 95-98.  doi:10.11896/j.issn.1002-137X.2016.6A.023
The moving least squares agent model is better than general agent models, but its accuracy is affected by the support domain radius. On the basis of an empirical formula, this paper proposed an optimization method for the support domain radius of the moving least squares agent model. The optimal radius for the sampling points in the support domain is obtained, improving the approximation accuracy and thereby reducing the number of sampling points needed. Numerical experiments show that, for different basis orders and weight functions, the proposed method greatly improves the approximation accuracy of the moving least squares agent model; compared with the moving least squares (MLS) agent model based on the empirical formula, the same approximation accuracy can be reached with only a few sampling points.
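A one-dimensional moving least squares evaluation shows how the support domain radius enters: only samples inside the radius receive nonzero weight in the local fit. A quadratic weight truncated at the radius and a linear basis are assumed here for simplicity; the paper's weight function and empirical radius formula may differ:

```python
import math

def mls_value(x, xs, ys, radius):
    """Moving least squares with a linear basis [1, x]: solve a weighted
    linear least squares problem locally at x, where the weight vanishes
    outside the support radius, then evaluate the local fit at x."""
    # samples outside the support domain get zero weight
    w = [max(0.0, 1.0 - (abs(x - xi) / radius) ** 2) for xi in xs]
    s0 = sum(w)
    s1 = sum(wi * xi for wi, xi in zip(w, xs))
    s2 = sum(wi * xi * xi for wi, xi in zip(w, xs))
    t0 = sum(wi * yi for wi, yi in zip(w, ys))
    t1 = sum(wi * xi * yi for wi, xi, yi in zip(w, xs, ys))
    det = s0 * s2 - s1 * s1          # 2x2 normal equations
    a = (t0 * s2 - t1 * s1) / det    # local intercept
    b = (s0 * t1 - s1 * t0) / det    # local slope
    return a + b * x

xs = [i * 0.5 for i in range(9)]     # samples on [0, 4]
ys = [math.sin(xi) for xi in xs]
approx = mls_value(1.0, xs, ys, radius=0.8)
```

Too small a radius leaves too few points for a stable fit (the normal equations become singular); too large a radius over-smooths, which is why the radius is worth optimizing rather than fixing by an empirical formula.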
Uncertainty Reasoning Based on Related Planar Cloud and Application in Prediction
LIU De-hao and WANG Qian
Computer Science. 2016, 43 (Z6): 99-102.  doi:10.11896/j.issn.1002-137X.2016.6A.024
Qualitative reasoning based on the cloud model implements uncertainty reasoning by constructing cloud rule generators. However, traditional two-dimensional cloud reasoning methods ignore the interaction between the condition clouds. As an improvement, a new planar cloud reasoning method based on related condition clouds was suggested in this paper, focusing on the product of cloud droplets and the construction of the corresponding rule generators. The new approach was then applied to predicting the GDP of America. The prediction results show that the new approach outperforms the traditional one, indicating that the method is feasible and effective and, to some extent, remedies the shortcomings of the traditional way.
Review of Gestures Recognition Based on Vision
YI Jing-guo, CHENG Jiang-hua and KU Xi-shu
Computer Science. 2016, 43 (Z6): 103-108.  doi:10.11896/j.issn.1002-137X.2016.6A.025
In recent years, with the development of computer vision and the requirements of human-computer interaction, gesture recognition research has made conspicuous progress. However, there are few reviews that analyze, summarize and evaluate gesture recognition. In this paper, summarizing the research and development status of vision-based gesture recognition, we analyzed gesture recognition technologies based on computer vision from the past 30 years. Hand gesture recognition was presented in three main steps: detection and segmentation, analysis, and recognition. We summarized the application fields and the advantages and disadvantages of the methods, and then drew conclusions and outlined future directions.
Survey on Image Inverse Halftoning and its Quality Evaluation
QU Xing-xing, ZHANG Fan, LIU Bin and ZHANG Ruo-ya
Computer Science. 2016, 43 (Z6): 109-115.  doi:10.11896/j.issn.1002-137X.2016.6A.026
Digital halftoning is one of the core technologies in digital and modern printing; it converts continuous-tone images into binary images. Inverse halftoning is essential for classification and recognition, image compression and image enhancement based on halftone images. In this paper, the different development stages of halftone images were first introduced. Current inverse halftoning methods were then classified by their image restoration principles, and an objective evaluation of the quality of reconstructed inverse halftone images was carried out. Lastly, the deficiencies of current inverse halftoning methods were summarized, along with the key directions for future improvement.
Using Contour Edge Curve’s Fourier Transform Method to Faster Match W-Shaped Pattern
WANG Yang and ZHANG Qin
Computer Science. 2016, 43 (Z6): 116-117.  doi:10.11896/j.issn.1002-137X.2016.6A.027
The "W" pattern target is a calibration target for four-wheel alignment, and quickly finding the right contour in a positioning image is very important. The Hu moment and pair-wise geometric histogram algorithms are the traditional methods, but they are slow and lack robustness. Fourier descriptors designed for trademark image retrieval can be applied to quick contour matching, but searching for a target contour in an image may introduce interference from many other contours. In this paper, we used the Fourier transform to extract a contour's spectral characteristics, measured similarity by the deviation of each candidate contour from the template contour, and simplified the algorithm to achieve fast matching. Its anti-interference ability is strong and its search efficiency is high, suggesting that matching contours via the Fourier transform of the edge curve has good applicability.
Frequency Domain Information Based Water Body Image Retrieval in High Resolution Satellite Image Databases
LI Yi-kun, HU Yu-xi and YANG Ping
Computer Science. 2016, 43 (Z6): 118-121.  doi:10.11896/j.issn.1002-137X.2016.6A.028
The core of a remote sensing image retrieval system is to find target images from a remote sensing image database quickly and accurately. Since the integrated region matching (IRM) algorithm can effectively reduce retrieval mistakes caused by inaccurate image segmentation, this paper used IRM as the image similarity measure and proposed a novel approach to retrieving water body images from a remote sensing image database, which sorts the retrieved images in ascending order of their average high frequency strength (AHFSS). Additionally, three different types of frequency-domain filters were used to obtain the AHFSS values of images and conduct retrieval experiments. The experimental results show that the proposed approach increases retrieval precision by 18% and that the ideal high frequency filter is the optimal filter. Therefore, the proposed approach offers the higher retrieval precision and efficiency needed to meet users' requirements.
Image Fusion Algorithm Using LP Transformation and PCNN-SML
WANG Quan, NIE Ren-can, JIN Xin, ZHOU Dong-ming, HE Kang-jian and YU Jie-fu
Computer Science. 2016, 43 (Z6): 122-124.  doi:10.11896/j.issn.1002-137X.2016.6A.029
Using the Laplacian pyramid (LP) algorithm and a pulse coupled neural network (PCNN), this paper proposed an effective multi-focus image fusion algorithm. First, the Laplacian pyramid is used for multi-scale decomposition of each image, the decomposed images are processed by the PCNN, and the corresponding neuron ignition frequency maps are obtained. Then the local entropy of each pixel's ignition frequency map is calculated, and the local sum of modified Laplacian (SML) is taken as the measure of pixel quality for fusing the source images. Finally, the Laplacian pyramid reconstruction algorithm generates the fused image. Experimental results indicate that the proposed method is effective and outperforms other traditional fusion algorithms.
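The Laplacian-pyramid decomposition and exact reconstruction that this fusion algorithm is built on can be sketched in one dimension (the PCNN, local entropy and SML steps are omitted; the blur kernel and signal are invented for illustration):

```python
def downsample(x):
    """Blur with a simple [1/4, 1/2, 1/4] kernel, then keep every other sample."""
    n = len(x)
    blurred = [0.25 * x[max(i - 1, 0)] + 0.5 * x[i] + 0.25 * x[min(i + 1, n - 1)]
               for i in range(n)]
    return blurred[::2]

def upsample(x, n):
    """Expand back to length n by linear interpolation."""
    return [x[min(i // 2, len(x) - 1)] if i % 2 == 0
            else 0.5 * (x[i // 2] + x[min(i // 2 + 1, len(x) - 1)])
            for i in range(n)]

def build_pyramid(x, levels):
    """Laplacian pyramid: each level stores the detail lost by one
    blur-and-downsample step; the last entry is the coarsest approximation."""
    pyr = []
    cur = x
    for _ in range(levels):
        small = downsample(cur)
        up = upsample(small, len(cur))
        pyr.append([a - b for a, b in zip(cur, up)])  # detail (Laplacian) level
        cur = small
    pyr.append(cur)
    return pyr

def reconstruct(pyr):
    """Inverse transform: upsample the coarse level and add back each detail
    level; reconstruction is exact by construction."""
    cur = pyr[-1]
    for detail in reversed(pyr[:-1]):
        cur = [a + b for a, b in zip(upsample(cur, len(detail)), detail)]
    return cur

signal = [1.0, 3.0, 2.0, 5.0, 4.0, 6.0, 2.0, 1.0]
pyr = build_pyramid(signal, levels=2)
rec = reconstruct(pyr)
```

In the fusion algorithm, the detail levels of the two source images are merged coefficient-by-coefficient (here guided by PCNN ignition frequency and SML) before this exact reconstruction produces the fused image.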
Adaptive Hand Tracking Based on Fusing Depth and Skin Features
NIU Chen-xiao, SUN Jin and DING Yong-hui
Computer Science. 2016, 43 (Z6): 125-129.  doi:10.11896/j.issn.1002-137X.2016.6A.030
Abstract PDF(1476KB) ( 449 )   
Hand tracking is key to realizing natural human-computer interaction. Existing hand tracking algorithms are easily affected by illumination and have poor robustness. In this paper, an adaptive hand tracking algorithm that fuses depth and skin color features was put forward. Considering hand deformation, the algorithm uses a depth threshold to adapt the tracking region, exploiting the continuity and smoothness of the depth feature, and obtains candidate hand regions. Then, using the skin color feature in YCbCr space, normalized skin histograms are built to describe the candidate regions. Within a particle filter framework, hand tracking is cast as a Bayesian estimation problem and the hand position is determined by the maximum a posteriori (MAP) estimate. The variance of the particle weights is monitored to detect and recover from tracking failure. Experimental results show that the proposed algorithm achieves accurate and robust real-time tracking against complex backgrounds.
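The weight-monitoring step can be sketched with the standard particle-filter bookkeeping (a generic sketch, not the paper's exact criterion: we use the effective sample size 1/Σw², which degenerates together with the weight variance, as the trigger for resampling):

```python
import random

# Particle-weight bookkeeping sketch: normalize weights, compute the effective
# sample size N_eff = 1 / sum(w_i^2), and resample systematically when the
# weight distribution has degenerated (N_eff small <=> weight variance large).

def normalize(w):
    s = sum(w)
    return [x / s for x in w]

def n_eff(w):                      # effective sample size of normalized weights
    return 1.0 / sum(x * x for x in w)

def systematic_resample(particles, w, rng=random.Random(0)):
    n = len(particles)
    cumsum, total = [], 0.0
    for x in w:                    # cumulative weight distribution
        total += x
        cumsum.append(total)
    out, i = [], 0
    for j in range(n):
        p = (j + rng.random()) / n  # one stratified draw per output slot
        while cumsum[i] < p:
            i += 1
        out.append(particles[i])
    return out
```

Uniform weights give N_eff equal to the particle count; a single dominant weight drives N_eff toward 1, at which point resampling concentrates particles on the likely hand position.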
Optimization Method of Seabed Sediment Texture Feature Based on Genetic Algorithm
LI Wen-li, GAO Hong-wei, JI Da-xiong and LI Yan
Computer Science. 2016, 43 (Z6): 130-133.  doi:10.11896/j.issn.1002-137X.2016.6A.031
Abstract PDF(1228KB) ( 484 )   
To improve the autonomous perception of underwater vehicles in seabed sediment classification and to address feature redundancy, using a genetic algorithm to optimize the texture features of seabed sediments was studied. In the context of seabed sediment classification and identification, a variety of visual texture features are extracted based on the gray level co-occurrence matrix and fractal theory; feature dimension reduction is then achieved by optimizing these texture features with a genetic algorithm, and the reduced feature set is used as the input for training a self-organizing map neural network for visual classification of seabed sediments, improving the environmental awareness of underwater vehicles during underwater operation. The experimental results show that, compared with the unoptimized texture features, the optimized features yield better classification performance in seabed sediment classification and recognition.
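GA-based feature selection of this kind can be sketched as follows (a hypothetical toy of our own: synthetic "texture features", a separability score as fitness, and a dimensionality penalty; the paper's actual fitness and data differ):

```python
import random

# GA feature-selection sketch: chromosomes are bitmasks over features; fitness
# rewards class-mean separation of the selected features and penalizes keeping
# many features. Data is synthetic: only features 0 and 3 discriminate.

rng = random.Random(42)
class_a = [[1.0 + rng.gauss(0, .1), rng.random(), rng.random(),
            2.0 + rng.gauss(0, .1), rng.random(), rng.random()] for _ in range(20)]
class_b = [[-1.0 + rng.gauss(0, .1), rng.random(), rng.random(),
            -2.0 + rng.gauss(0, .1), rng.random(), rng.random()] for _ in range(20)]

def mean(col, rows):
    return sum(r[col] for r in rows) / len(rows)

def fitness(mask):
    if not any(mask):
        return -1.0
    sep = sum(abs(mean(i, class_a) - mean(i, class_b))
              for i, m in enumerate(mask) if m)
    return sep - 0.3 * sum(mask)          # penalize redundant features

def evolve(pop=20, gens=40, nfeat=6):
    population = [[rng.randint(0, 1) for _ in range(nfeat)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        parents = population[:pop // 2]   # elitist selection
        children = []
        while len(children) < pop - len(parents):
            p1, p2 = rng.sample(parents, 2)
            cut = rng.randrange(1, nfeat)
            child = p1[:cut] + p2[cut:]   # one-point crossover
            if rng.random() < 0.2:        # bit-flip mutation
                j = rng.randrange(nfeat)
                child[j] ^= 1
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

best = evolve()
```

The surviving bitmask is the reduced feature subset that would be fed to the classifier (a self-organizing map in the paper).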
Sketch Recognition Method of Combined Graphs for Conceptual Design
JI Hai-feng and TIAN Huai-wen
Computer Science. 2016, 43 (Z6): 134-138.  doi:10.11896/j.issn.1002-137X.2016.6A.032
Abstract PDF(1168KB) ( 522 )   
This article introduced a sketch recognition method for conceptual design. A graphic symbol is regarded as a composition of multiple single-stroke graphs. First, the types of the individual primitives are identified and the spatial relationships between them are extracted. Then the structure of each standard graph is defined, and structure matching compares the graphics drawn by the user against the standard templates to filter out the graphs with the same structure. Next, a Fourier shape descriptor is used to describe the graphic shapes, compute shape similarity, and filter out the graphs with the same shape. By combining the two features of structure and shape, the method can quickly and accurately identify the motion mechanism, extract engineering semantics, and capture the design intent.
Using Variable Value Coding to Show 4 Classical Classification Models of Cellular Automata
ZENG Ping-an and ZHENG Zhi-jie
Computer Science. 2016, 43 (Z6): 139-141.  doi:10.11896/j.issn.1002-137X.2016.6A.033
Abstract PDF(676KB) ( 486 )   
The classification of cellular automaton rules is a classical topic in the study of the global features of Boolean functions. Since 1984, four classical classifications have been proposed by Wolfram. Under an ordinal arrangement, the two-dimensional encoding of the 256 rules in the classical classification model shows no obvious distribution regularity. We used the encoding mode of the variable value logic system to display the patterns of the existing cellular automaton classifications, and used several types of variable value logic coding to show the classical classification results by arranging the 256 rules in a 16*16 matrix image. In this mode, the variable value encoding exhibits distinct symmetry characteristics. Finally, the different encoding sequences and encoding arrangements were listed.
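The objects being classified can be made concrete with the standard elementary-CA machinery (the 16*16 layout below, row = high nibble, column = low nibble of the rule number, is our illustrative arrangement, not the paper's specific variable-value coding):

```python
# Each of the 256 Boolean rules maps a 3-cell neighbourhood (left, centre,
# right) to a new cell state: neighbourhood index = 4*l + 2*c + r, and the
# rule number's bit at that index is the output (Wolfram numbering).

def step(cells, rule):
    n = len(cells)
    out = []
    for i in range(n):             # circular boundary
        idx = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        out.append((rule >> idx) & 1)
    return out

def matrix_16x16():
    # simple two-nibble arrangement of all 256 rule numbers in a 16x16 grid
    return [[16 * r + c for c in range(16)] for r in range(16)]
```

Rearranging this matrix under different codings is what makes the class structure of the 256 rules visually apparent.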
Research on Three-dimensional Tree Model Simplification for Web Applications
DONG Tian-yang, YAO Jia-jie and JI Lei
Computer Science. 2016, 43 (Z6): 142-148.  doi:10.11896/j.issn.1002-137X.2016.6A.034
Abstract PDF(2129KB) ( 518 )   
Constructing and transmitting 3D tree models in network environments requires not only high model fidelity but also efficient real-time interaction. Because existing geometry-based and image-based 3D model simplifications cannot satisfy the detail and storage requirements of 3D tree models, a novel simplification method of 3D tree models for web applications was proposed. This method extracts the skeleton nodes of the branches of the tree model and simplifies the branches by a non-uniform skeleton simplification method. Furthermore, it simplifies the tree crowns using crown textures and reconstructs them with convex-hull-based texture mapping, so as to improve the similarity of the simplified 3D tree models. Applications show that this method can reduce the storage size of model files and improve network transmission efficiency while maintaining excellent visual quality.
Fingerprint Enhancement Based on Straight-curved Line Gabor Filter
MEI Yuan, ZHAO Bo and ZHU Zhi-dan
Computer Science. 2016, 43 (Z6): 149-151.  doi:10.11896/j.issn.1002-137X.2016.6A.035
Abstract PDF(971KB) ( 573 )   
Fingerprint enhancement is of great significance for improving the accuracy of automatic fingerprint identification systems (AFIS). To balance the weakness of the straight-line Gabor filter in highly curved regions against the high computational complexity of the curved-line Gabor filter, a novel straight-curved line Gabor filter was proposed in this paper. First, orientation coherence is used to divide the fingerprint into curved and smooth regions. Then curved-line filtering and straight-line filtering are applied to the curved and smooth regions respectively. Experiments on FVC2002 show that, compared with the curved-line Gabor filter, the proposed method achieves comparable enhancement quality at 1.404 times the computation speed.
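The straight-line building block is the classical oriented Gabor kernel (standard formulation; the parameter names and values below are generic, not taken from the paper): a Gaussian envelope modulated by a cosine wave along orientation theta, which enhances ridges whose local orientation matches theta.

```python
import math

# Straight-line Gabor kernel sketch (isotropic Gaussian envelope for brevity).
def gabor_kernel(size, theta, freq, sigma):
    half = size // 2
    k = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)    # rotate coordinates
            yr = -x * math.sin(theta) + y * math.cos(theta)
            g = math.exp(-(xr * xr + yr * yr) / (2 * sigma * sigma))
            row.append(g * math.cos(2 * math.pi * freq * xr))  # oriented carrier
        k.append(row)
    return k
```

The curved-line variant bends the sampling coordinates along the estimated ridge curvature before evaluating the same kernel, which is what makes it more expensive.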
Fast Image Registration Algorithm for PCB Images
WANG Dong, MA Chun-yong and CHEN Ge
Computer Science. 2016, 43 (Z6): 152-155.  doi:10.11896/j.issn.1002-137X.2016.6A.036
Abstract PDF(992KB) ( 798 )   
PCB (printed circuit board) image registration is a key step in automatic optical inspection. A PCB contains many similar graphics and regions, so the general method of extracting and matching feature points is inefficient and prone to false matches. A fast registration algorithm based on a similar-triangle constraint was proposed, which takes specific geometric centers as the feature points. The sets of centers of round and square elements are Delaunay triangulated, similar triangles are found in the two triangulation networks, and a further subdivision and comparison is performed to enhance reliability. Experiments show that the method is rigid and highly accurate, and obtains correct, uniformly distributed matching points.
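The similar-triangle constraint itself is simple to state (a minimal sketch of our own; the tolerance is illustrative): two triangles are similar when their sorted side lengths are proportional.

```python
import math

# Similarity test for triangles given as three (x, y) vertices.
def sides(tri):
    (ax, ay), (bx, by), (cx, cy) = tri
    d = lambda x1, y1, x2, y2: math.hypot(x1 - x2, y1 - y2)
    return sorted([d(ax, ay, bx, by), d(bx, by, cx, cy), d(cx, cy, ax, ay)])

def similar(t1, t2, tol=1e-6):
    s1, s2 = sides(t1), sides(t2)
    r = s1[0] / s2[0]                       # candidate scale factor
    return all(abs(a / b - r) <= tol for a, b in zip(s1, s2))
```

Because the test depends only on side-length ratios, it is invariant to the translation, rotation, and scale between the two triangulation networks.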
New Image Text Detection Method Based on Double-threshold Gradient Pattern
CAI Wen-zhe, WANG Bin-jun and LI Pei-yue
Computer Science. 2016, 43 (Z6): 156-164.  doi:10.11896/j.issn.1002-137X.2016.6A.037
Abstract PDF(2200KB) ( 541 )   
This paper studied traditional image text detection approaches and proposed a new method based on a double-threshold gradient pattern that is fast in both classifier training and detection. First, in the coarse detection phase, maximally stable extremal regions (MSER) are extracted as candidate text areas to avoid scanning the whole image, greatly improving detection speed and real-time performance. Second, in the feature extraction part of the fine detection phase, to overcome the color contrast inversion of text regions and the noise in natural images, this paper presented a double-threshold gradient pattern feature to describe the texture of text regions. Finally, as the detector for fine text detection, a new cascade ELM (extreme learning machine) detector was designed, which greatly shortens classifier training time. The experimental results show that this method not only achieves excellent detection performance but also greatly shortens both classifier training and testing time.
SIFT Feature Extraction Parallel Algorithm on Mobile Device
GAN Wei, ZHANG Su-wen, LEI Zhen and LI Yi-fan
Computer Science. 2016, 43 (Z6): 165-167.  doi:10.11896/j.issn.1002-137X.2016.6A.038
Abstract PDF(810KB) ( 598 )   
Feature extraction and matching is an important part of computer vision applications such as image matching, object recognition, and video tracking. The SIFT algorithm is widely used in image registration because of its scale and rotation invariance. To address the low efficiency of the traditional SIFT algorithm, we proposed an efficient implementation on a mobile platform. In this paper, we used OpenCL to implement the SIFT algorithm on a mobile device, redistributing the computation tasks and optimizing the algorithm for mobile OpenCL parallel execution. Experimental results show that our implementation takes full advantage of the GPU's parallel computing power, greatly improving the efficiency of the SIFT algorithm and achieving efficient feature extraction.
Study on Image Segmentation Based on RGB Color Image
MO Ling
Computer Science. 2016, 43 (Z6): 168-170.  doi:10.11896/j.issn.1002-137X.2016.6A.039
Abstract PDF(756KB) ( 565 )   
Image segmentation is a major problem in image processing, and its quality directly affects the results of subsequent image analysis. Color image segmentation partitions a color image into regions with different characteristics and extracts the targets of interest, laying the foundation for subsequent processing. Watershed segmentation of the color image gradient map causes over-segmentation; therefore, after comparing various image segmentation methods (threshold segmentation, Otsu segmentation, and maximum entropy segmentation), a maximum-entropy color image segmentation method based on a genetic algorithm was proposed. The test results show that the algorithm can effectively segment color images.
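The maximum-entropy criterion being optimized can be sketched directly (Kapur-style formulation on a toy gray-level histogram; the paper searches this criterion with a genetic algorithm, while the sketch simply scans every threshold):

```python
import math

# Pick the threshold t that maximizes the sum of the entropies of the two
# classes {levels < t} and {levels >= t} of a gray-level histogram.
def max_entropy_threshold(hist):
    total = sum(hist)
    p = [h / total for h in hist]
    best_t, best_h = 0, float("-inf")
    for t in range(1, len(hist)):
        w0 = sum(p[:t])                   # class probabilities
        w1 = 1.0 - w0
        if w0 <= 0 or w1 <= 0:
            continue
        h0 = -sum(q / w0 * math.log(q / w0) for q in p[:t] if q > 0)
        h1 = -sum(q / w1 * math.log(q / w1) for q in p[t:] if q > 0)
        if h0 + h1 > best_h:
            best_t, best_h = t, h0 + h1
    return best_t
```

On a full 256-level histogram (or per-channel for color), the GA replaces this exhaustive scan with an evolutionary search over candidate thresholds.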
Visual Saliency Based Multi-view Video Coding Algorithm
LUO Xiao-lin and LUO Lei
Computer Science. 2016, 43 (Z6): 171-174.  doi:10.11896/j.issn.1002-137X.2016.6A.040
Abstract PDF(1146KB) ( 481 )   
For the compression problem of multi-view video coding, a visual saliency based coding algorithm was proposed. Based on the fact that human eyes are more sensitive to distortion in salient regions, the proposed method controls the coding quality of salient and non-salient regions separately to improve multi-view video coding efficiency. First, a pixel-level visual saliency map of the multi-view video is extracted by a video saliency filter that fuses color and motion information. Then, the saliency map is transformed into a saliency representation of each coding macroblock. Finally, following the principles of perceptual video coding, the coding quality of each macroblock is adaptively controlled according to its saliency. The experimental results demonstrate that the proposed method effectively improves the rate-distortion performance and subjective quality of multi-view video coding.
Visualization Research of Point Cloud Data in 3D Laser Scanning
XU Xu-dong and LI Ze
Computer Science. 2016, 43 (Z6): 175-178.  doi:10.11896/j.issn.1002-137X.2016.6A.041
Abstract PDF(997KB) ( 716 )   
3D laser scanning obtains large amounts of point cloud data, whose display speed is directly affected by the data structure. A spatial index structure that combines an octree with K-D trees in the leaf nodes, together with a level-of-detail (LOD) model, is an efficient way to solve the low efficiency of managing and visualizing point cloud data. Globally, quick indexing and management are realized by the octree model; locally, efficient query and display are realized by K-D trees constructed in memory. This mixed data model is adopted to organize the point cloud, establish the spatial index, and build LOD representations of the point cloud data, thereby realizing its indexing and visualization.
Image Copy-Paste Tampering Detection Based on Improved SIFT Algorithm
LI Kun-lun and SUN Shuo
Computer Science. 2016, 43 (Z6): 179-183.  doi:10.11896/j.issn.1002-137X.2016.6A.042
Abstract PDF(1144KB) ( 761 )   
Copy-paste is a technique widely used in image tampering and is one of the most covert tampering methods. SIFT is a common matching algorithm and an effective means of detecting copy-paste tampered images, but it suffers from poor matching accuracy and high time complexity. To overcome these problems, several improvements were made in this paper. The threshold is determined by fitting optimization to preserve accuracy as the threshold increases, and the SIFT feature point extraction procedure is improved. The BBF search algorithm based on a K-D tree is adopted to achieve fast nearest-neighbor matching, improving the feature matching stage of SIFT and easing the problem of high time complexity. The experimental results show that the proposed algorithms are effective.
Novel PCB Defect Detection Based on Morphology Image Process
WANG Dong and XIE Ze-xiao
Computer Science. 2016, 43 (Z6): 184-186.  doi:10.11896/j.issn.1002-137X.2016.6A.043
Abstract PDF(929KB) ( 638 )   
An automatic PCB optical defect detection algorithm based on mathematical morphology was proposed. The edge is extracted from the reference image after an erosion operation. When this edge is used as the ROI for the distance transform of the test image, each point on the edge has a corresponding distance to the border of the track. Thresholding the distance image rapidly detects PCB defects. Combined with a comparison of contour features, the algorithm can accurately identify the types of defect. Experiments show that the algorithm can quickly detect all kinds of defects in PCB images and perform accurate automatic classification and recognition.
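The two morphology primitives the algorithm relies on can be sketched in pure Python on a tiny binary grid (a generic sketch: 3x3 erosion and a two-pass chamfer distance transform with city-block distances; grid values 1 = copper track, 0 = background):

```python
# 3x3 binary erosion: a pixel survives only if its whole 3x3 neighbourhood is
# foreground (border pixels are simply dropped in this sketch).
def erode(img):
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = int(all(img[y + dy][x + dx]
                                for dy in (-1, 0, 1) for dx in (-1, 0, 1)))
    return out

# Two-pass chamfer distance transform: distance of each foreground pixel to
# the nearest background pixel (4-connected, unit steps).
def distance_transform(img):
    h, w = len(img), len(img[0])
    INF = h + w
    d = [[0 if v == 0 else INF for v in row] for row in img]
    for y in range(h):                       # forward pass
        for x in range(w):
            if y > 0:
                d[y][x] = min(d[y][x], d[y - 1][x] + 1)
            if x > 0:
                d[y][x] = min(d[y][x], d[y][x - 1] + 1)
    for y in range(h - 1, -1, -1):           # backward pass
        for x in range(w - 1, -1, -1):
            if y < h - 1:
                d[y][x] = min(d[y][x], d[y + 1][x] + 1)
            if x < w - 1:
                d[y][x] = min(d[y][x], d[y][x + 1] + 1)
    return d
```

Reading the distance image along the eroded reference edge gives each edge point's distance to the track border; points whose distance falls outside the expected band flag a defect.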
IMU Based Construction of Sign Language Skeleton Model
SUN Xie and CHEN Xi
Computer Science. 2016, 43 (Z6): 187-190.  doi:10.11896/j.issn.1002-137X.2016.6A.044
Abstract PDF(1108KB) ( 699 )   
Sign language recognition belongs to the research field of gesture recognition. The traditional data-glove-based method of sign language recognition cannot capture all the elements of sign language, as it cannot recognize movements coordinated between the hands and limbs. The IMU (inertial measurement unit), owing to its small size and low cost, has been widely applied in motion capture. Based on robot kinematics, we proposed an IMU-based sign language skeleton model that is consistent with the biological characteristics of the human body. The model is constructed by first selecting the skeleton and then calibrating the size of the skeleton. Finally, an experimental method was proposed to calibrate the model size, which can be solved using the action-set data acquired by the IMUs.
Adaptive Image Segmentation Using Affinity Propagation Clustering
DAI Shan and LI Guang-jun
Computer Science. 2016, 43 (Z6): 191-193.  doi:10.11896/j.issn.1002-137X.2016.6A.045
Abstract PDF(726KB) ( 490 )   
This paper presented a unified approach for automatic image segmentation. To segment the image into homogeneous regions, a two-stage method was proposed. First, an improved simple linear iterative clustering (SLIC) method is adopted to over-segment the image. Then, color moments of each local region are computed as region descriptors, and affinity propagation clustering is adopted to merge the regions produced in the first stage. Numerous experiments on publicly available datasets demonstrate the effectiveness and robustness of the proposed algorithm.
3D-surface Reconstruction Algorithm for Medical Images Based on MITK
ZHOU Juan
Computer Science. 2016, 43 (Z6): 194-197.  doi:10.11896/j.issn.1002-137X.2016.6A.046
Abstract PDF(1037KB) ( 597 )   
Studying MITK and following its design criteria of an integrated framework, data model, and algorithm model, we implemented the Marching Cubes surface rendering algorithm on the voxels of medical image sequences and displayed the reconstructed surface model in 3D. The contour surface constructed by this algorithm gives a sharp image of the contour of interest, although it cannot reflect the overall view and details of the whole raw data, and it supports surface rendering and real-time interactive operation on existing graphics hardware.
Monocular Vision Alignment Algorithm Based on ORB
ZHU Yong-feng, ZHU Shu-long, ZHANG Jing-jing and ZHU Yong-kang
Computer Science. 2016, 43 (Z6): 198-202.  doi:10.11896/j.issn.1002-137X.2016.6A.047
Abstract PDF(1307KB) ( 528 )   
We proposed a monocular visual localization algorithm that is faster, more precise, and more robust than current state-of-the-art methods in scenes with wide-range, repeated, high-frequency texture features (for example, cement or lawn). We extract ORB (oriented FAST and rotated BRIEF) features from the image sequence, match point correspondences between images using KNN, then compute the fundamental matrix F and the essential matrix E and use them to compute the initial camera pose. We then compute the rotation matrix R and translation vector t for every pair of images, estimate the camera motion between the current and previous images, triangulate 3D points (structure) from 2D image correspondences, and track the camera trajectory. To improve the algorithm, we refine the pose and structure by minimizing the reprojection error. Tests on our datasets, implemented in OpenCV/C++, give results much better than traditional algorithms. Since this is a monocular implementation, absolute scale cannot be estimated.
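The reprojection error minimized in the refinement step can be sketched with a pinhole model (generic formulation; the intrinsics are reduced to a single focal length f in this simplification of ours):

```python
# Project a 3-D point X through pose (R, t) and measure the pixel-space
# distance to an observed image point uv.

def matvec(R, v):
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def project(R, t, X, f=1.0):
    Xc = [a + b for a, b in zip(matvec(R, X), t)]   # point in camera frame
    return (f * Xc[0] / Xc[2], f * Xc[1] / Xc[2])   # perspective division

def reproj_error(R, t, X, uv):
    u, v = project(R, t, X)
    return ((u - uv[0]) ** 2 + (v - uv[1]) ** 2) ** 0.5
```

Bundle-adjustment-style refinement sums this error over all observed correspondences and adjusts the poses and 3-D points to minimize it; note that scaling every t and X by the same factor leaves the error unchanged, which is exactly the monocular scale ambiguity mentioned above.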
Moving Object Tracking Based on Five Frame Difference and Improved Meanshift Algorithm
CHEN Shuang-ye and WANG Shan-xi
Computer Science. 2016, 43 (Z6): 203-206.  doi:10.11896/j.issn.1002-137X.2016.6A.048
Abstract PDF(887KB) ( 627 )   
To address the edge holes that tend to appear in traditional frame-difference moving object detection, and to reduce the target losses of the traditional Meanshift algorithm used in video monitoring, which is easily disturbed by the background and often fails to track, a new method was proposed: a dynamic threshold is employed in a five-frame difference method to detect the moving object, and an improved Meanshift algorithm with background weighting and template updating is designed to realize object localization and tracking. The new method improves the real-time performance and robustness of tracking. The results show that the method is feasible, detects moving objects accurately, and improves the reliability of object tracking.
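Five-frame differencing with a dynamic threshold can be sketched as follows (one common variant of our own choosing, not necessarily the paper's exact formulation: the middle frame is differenced against the two frames before and after it, each difference is thresholded at mean + k*std, and the binary masks are ANDed):

```python
# Frames are small 2-D lists of gray values.

def absdiff(a, b):
    return [[abs(x - y) for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def dyn_thresh(diff, k=1.0):
    # data-driven threshold: mean + k * std of the difference image
    vals = [v for row in diff for v in row]
    mean = sum(vals) / len(vals)
    std = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5
    t = mean + k * std
    return [[1 if v > t else 0 for v in row] for row in diff]

def five_frame_mask(frames, k=1.0):
    f1, f2, f3, f4, f5 = frames
    masks = [dyn_thresh(absdiff(f3, f), k) for f in (f1, f2, f4, f5)]
    h, w = len(f3), len(f3[0])
    # AND: keep only pixels that differ consistently across all four pairs
    return [[int(all(m[y][x] for m in masks)) for x in range(w)]
            for y in range(h)]
```

The AND combination suppresses the hollow edges of two-frame differencing, and the data-driven threshold adapts to illumination changes between frames.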
Study on Pedestrian Detection Based on Sparse Representation and Machine Learning
WANG Jian and LAN Tian
Computer Science. 2016, 43 (Z6): 207-209.  doi:10.11896/j.issn.1002-137X.2016.6A.049
Abstract PDF(710KB) ( 478 )   
For the application of pedestrian detection in intelligent transportation systems, and to improve the efficiency, real-time performance, and accuracy of pedestrian detection, sparse representation was applied to compress image features, and a new pedestrian detection method that trains an SVM classifier on HOG and LTP features was proposed. Training the SVM on HOG and LTP features effectively combines image gradient and texture features, while the sparse representation provides data compression that effectively speeds up the algorithm. Experimental results show that the proposed algorithm achieves both high precision and high speed.
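The LTP texture feature can be sketched for a single 3x3 neighbourhood (standard local ternary pattern definition; the tolerance and the clockwise bit ordering are our illustrative choices):

```python
# Each neighbour is coded +1 / 0 / -1 relative to the centre within tolerance
# t, then the ternary code is split into an "upper" and a "lower" binary code,
# which is the usual way LTP is histogrammed.

def ltp_codes(patch, t=5):
    c = patch[1][1]
    neigh = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
             patch[2][2], patch[2][1], patch[2][0], patch[1][0]]  # clockwise
    upper = lower = 0
    for bit, v in enumerate(neigh):
        if v >= c + t:          # neighbour clearly brighter -> upper code
            upper |= 1 << bit
        elif v <= c - t:        # neighbour clearly darker -> lower code
            lower |= 1 << bit
    return upper, lower
```

Unlike LBP, the tolerance band around the centre value makes the code stable under small noise, which is why LTP complements the gradient-based HOG feature here.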
Detection of Driver’s Head-dipping Based on Computer Vision
YANG Xiao-feng, DENG Hong-xia and LI Hai-fang
Computer Science. 2016, 43 (Z6): 210-213.  doi:10.11896/j.issn.1002-137X.2016.6A.050
Abstract PDF(867KB) ( 523 )   
To monitor and warn against distraction caused by mobile phone use during driving, a method based on facial feature extraction was proposed to detect the driver's head behavior. The method uses the ASM (active shape model) to obtain facial feature points, computes a head posture description from the position information of these points, and finally classifies the head posture with an SVM. Experimental results show that the method can effectively detect the driver's head-dipping during driving, with an average detection rate above 94%.
Total Variance with High-order Coupling Term for Color Image Restoration
MA Hong-hua, HUANG Yong-lin and DING Yan-yan
Computer Science. 2016, 43 (Z6): 214-216.  doi:10.11896/j.issn.1002-137X.2016.6A.051
Abstract PDF(724KB) ( 501 )   
A new total variation method for color image restoration was proposed. To overcome the blocky effect produced by the anisotropic diffusion of the TV model, a high-order term is added to it. During color image restoration, a multi-channel coupling mechanism imposes mutual constraints between the monochrome channels. The new model preserves edges owing to its anisotropic diffusion. The experimental results show that images processed by the proposed model have a higher PSNR (peak signal-to-noise ratio) than those processed by other models, and non-boundary regions look more natural.
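The variational idea can be illustrated on a 1-D, single-channel toy (our simplified stand-in for the paper's coupled multi-channel model with high-order term; parameter values are illustrative): gradient descent on a data-fidelity term plus a smoothed total-variation penalty.

```python
import math

# E(u) = sum_i (u_i - f_i)^2 + lam * sum_i sqrt((u_{i+1} - u_i)^2 + eps)
def energy(u, f, lam=0.5, eps=0.01):
    fid = sum((a - b) ** 2 for a, b in zip(u, f))
    tv = sum(math.sqrt((u[i + 1] - u[i]) ** 2 + eps) for i in range(len(u) - 1))
    return fid + lam * tv

def tv_denoise_1d(f, lam=0.5, eps=0.01, step=0.02, iters=800):
    u = list(f)
    n = len(u)
    dpsi = lambda x: x / math.sqrt(x * x + eps)   # derivative of smoothed |x|
    for _ in range(iters):
        g = [2 * (u[i] - f[i]) for i in range(n)]  # fidelity gradient
        for i in range(n):                         # TV gradient contributions
            if i > 0:
                g[i] += lam * dpsi(u[i] - u[i - 1])
            if i < n - 1:
                g[i] -= lam * dpsi(u[i + 1] - u[i])
        u = [ui - step * gi for ui, gi in zip(u, g)]
    return u
```

The TV penalty shrinks small oscillations strongly but charges large jumps only linearly, which is the edge-preserving behavior the abstract refers to; the paper's high-order term and channel coupling add further smoothness and inter-channel constraints on top of this.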
One-class Information Extraction from Remote Sensing Imagery Based on Nearest Neighbor Rule
BO Shu-kui and JING Yong-ju
Computer Science. 2016, 43 (Z6): 217-218.  doi:10.11896/j.issn.1002-137X.2016.6A.052
Abstract PDF(761KB) ( 486 )   
One-class extraction from remote sensing imagery is a special classification setting in which users are interested in recognizing only one specific land type. The extraction of a specific class based on the nearest neighbor rule was studied in this paper, considering two aspects: class partitioning and sample selection for each class. First, the effect of partitioning the data distribution is analyzed theoretically for nearest-neighbor one-class classification; it is confirmed that the nearest neighbor classifier requires the data distribution to be partitioned into only two classes, namely the class of interest and the remainder. Second, treating the task as a two-class problem simplifies the classification process, and sample selection for nearest-neighbor classification is performed in terms of both the spatial domain and the feature space. The experiments show that the specific class of interest can be extracted well from remote sensing images with the proposed method.
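The two-class nearest-neighbor setup reduces to a very small core (a toy sketch with 2-D feature points and labels of our own making):

```python
# Classify a pixel's feature vector by the label of the closest labelled
# sample: "class of interest" vs. "everything else".

def nn_classify(x, samples):
    # samples: list of ((f1, f2), label)
    def d2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(samples, key=lambda s: d2(x, s[0]))[1]
```

The paper's contribution lies in how the "remainder" class and its samples are chosen; once that partition is fixed, extraction is exactly this rule applied per pixel.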
Research on Single Nucleotide Polymorphism Encoding in Disease Association Studies
ZHAO Jing, WEI Bin and ZHANG Jin
Computer Science. 2016, 43 (Z6): 219-221.  doi:10.11896/j.issn.1002-137X.2016.6A.053
Abstract PDF(944KB) ( 562 )   
Because SNPs have characteristics such as high abundance and a low mutation rate, they are suitable for disease association studies. Many such studies are based on computational methods, so encoding SNPs to enhance the performance of disease association analysis algorithms is a critical aspect; however, few studies have been dedicated to this issue. Therefore, based on common SNP encoding methods and the associations between them, we proposed several new encoding methods. The experimental results show that the encoding method has a considerable impact on algorithm performance and that the methods described here outperform the others. That is, the proposed encoding methods better describe SNP sequences, retain the original biological sequence information, and are more suitable for disease susceptibility research.
Detection and Localization of Insulator Defects in Aerial Images
FANG Ting and HAN Jia-ming
Computer Science. 2016, 43 (Z6): 222-225.  doi:10.11896/j.issn.1002-137X.2016.6A.054
Abstract PDF(912KB) ( 557 )   
Insulators exposed to the natural environment for a long time are prone to self-shattering faults. For this situation, this paper presented an image processing method for insulator defect detection and localization. Based on the pattern characteristics of insulators in aerial images, the between-cluster variance method and median filtering are used to preprocess the image. An ant colony algorithm, with parameters optimized by particle swarm optimization, is proposed to detect the contours and the number of insulators in the aerial image. Finally, the location of the defective insulator is marked in the original image. The method works well for defective insulator detection against simple backgrounds, and establishes a pre-research foundation for insulator defect detection and localization against complex backgrounds.
Non-rigid Point Set Registration Algorithm Based on Iteration
ZHOU Hong-yu, YANG Yang and ZHANG Su
Computer Science. 2016, 43 (Z6): 226-231.  doi:10.11896/j.issn.1002-137X.2016.6A.055
Abstract PDF(1260KB) ( 491 )   
We proposed a non-rigid point set registration algorithm. It uses robust global and local multi-features for correspondence estimation, combined with a Gaussian mixture model for transformation updating. First, to measure global and local structural diversities between two point sets, we introduced two distance features. The two features then form a multi-feature cost matrix, which provides a flexible way to estimate correspondences by minimizing the global or local structural diversities. Finally, we designed a Gaussian-mixture-model-based energy function to refine the transformation update, minimized by L2 distance minimization. We tested the performance of the algorithm on contour registration, image sequences, and real images, comparing it against four state-of-the-art methods. The algorithm shows the best alignment in almost all of the experiments.
Application Research of Skeleton Extraction Algorithm Based on Image Processing
DIAO Zhi-hua, WU Bei-bei, WU Yuan-yuan and WEI Yu-quan
Computer Science. 2016, 43 (Z6): 232-235.  doi:10.11896/j.issn.1002-137X.2016.6A.056
Abstract PDF(1029KB) ( 603 )   
The skeleton is an important transform in image analysis and shape description; it is a ubiquitous and important topological structure that is otherwise difficult to describe in image geometry. Skeleton extraction technology in image processing has long been a focus of scholarly attention. Based on extensive literature research, skeleton extraction methods were reviewed, their applications in agriculture were summarized, and skeleton extraction techniques used in other fields were introduced. Finally, we indicated the main problems and trends of skeleton extraction algorithms, and provided a summary and outlook as a reference for the development of the field and related research.
Recognition Algorithm of Outlier and Boundary Points Based on Relative Density
LI Guang-xing
Computer Science. 2016, 43 (Z6): 236-238.  doi:10.11896/j.issn.1002-137X.2016.6A.057
Abstract PDF(960KB) ( 485 )   
Outlier points are data that are inconsistent with most of the data in a data set, and boundary points are located on the edges of data regions of different densities. Based on these facts, an algorithm using relative density was proposed to identify outlier and boundary points. The neighborhood centered at a point with radius r is divided into two semi-neighborhoods, and the point's isolation level and boundary level are determined from the relative densities of the semi-neighborhoods with respect to the original neighborhood; a final judgment of whether the point is a boundary or outlier point is then made against a threshold value. Experimental results indicate that this algorithm can effectively and accurately identify outlier and boundary points in multi-density data sets.
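The semi-neighborhood idea can be illustrated on 1-D data (a deliberately simplified toy of our own: the radius-r neighborhood splits into a left and right half, and the count thresholds are illustrative rather than the paper's):

```python
# A point is an outlier when both semi-neighbourhoods are nearly empty, a
# boundary point when one side is much denser than the other, and an inner
# point otherwise.

def classify_point(p, data, r=1.5, min_side=2):
    left = sum(1 for q in data if 0 < p - q <= r)    # neighbours below p
    right = sum(1 for q in data if 0 < q - p <= r)   # neighbours above p
    if left < min_side and right < min_side:
        return "outlier"
    if left < min_side or right < min_side:
        return "boundary"
    return "inner"
```

In higher dimensions the split is taken across the neighborhood rather than along a fixed axis, but the density comparison between the two halves plays the same role.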
Multi Angle Acquisition and 3D Observation of Insect Specimen Image
LIU Gui-yang, GUO Xin-tong, XI Gui-qing and LIU Jin-ming
Computer Science. 2016, 43 (Z6): 239-241.  doi:10.11896/j.issn.1002-137X.2016.6A.058
Abstract PDF(709KB) ( 542 )   
Because current electronic insect specimens are too simple and cannot be observed clearly from multiple angles, an HD original image library system for insect specimens was established. PC software controls a servo motor, with an SCM sending pulse signals to rotate the insect specimen through 360 degrees while a macro camera auto-focuses and shoots. The system implements multi-angle observation of 3D insect image models using thumbnail panorama display, dynamic loading of HD images, and capture and processing of user messages. The system features bulk sample collection, real-time 3D observation, and high-definition display of details, and provides data support for the teaching and identification of insects.
Network Coding Based Topology Inference:A Survey
XU Jing, LIU Yan-tao, XIA Gui-yang and Yasser MORGAN
Computer Science. 2016, 43 (Z6): 242-248.  doi:10.11896/j.issn.1002-137X.2016.6A.059
Abstract PDF(2033KB) ( 530 )   
Topology is one of the important parameters of a network, and acquiring it is a meaningful fundamental problem, especially for monitoring and managing a network. The birth of network coding offers new ideas and methods for network topology inference: the coding operations performed by network nodes introduce close relationships between the encoded data and the network topology, which can be further exploited to recover topologies. Compared with methods based on network tomography, network coding based topology inference outperforms them in many aspects, such as higher inference accuracy and lower algorithm complexity. This paper surveyed the state-of-the-art studies of network coding based topology inference.
Research Survey of Virtual Machine Placement Problem
TONG Jun-jie, HE Gang and FU Gang
Computer Science. 2016, 43 (Z6): 249-254.  doi:10.11896/j.issn.1002-137X.2016.6A.060
Abstract PDF(1448KB) ( 645 )   
References | Related Articles | Metrics
With the increasing number and scale of cloud computing data centers, which generally adopt virtualization technologies, the virtual machine placement problem is becoming a hot topic in both industry and academia. The choice of policies and methods for virtual machine placement affects the energy consumption of data centers, the utilization of resources and the performance of virtual machines. Proper policies and methods keep upper-layer applications and services unaffected while lowering energy consumption, increasing resource utilization and decreasing resource waste. Based on existing research, this paper described three essentials of the virtual machine placement problem: the optimization function, the constraint conditions and the methodology. It also presented a summary of current work and some crucial problems which urgently need to be solved.
High-power Broadcasting Based Routing Scheme for Delay Tolerant Mobile Sensor Networks
YANG Kui-wu
Computer Science. 2016, 43 (Z6): 255-259.  doi:10.11896/j.issn.1002-137X.2016.6A.061
Abstract PDF(1199KB) ( 466 )   
References | Related Articles | Metrics
This paper proposed a high-power broadcasting based routing scheme (HBR) for delay tolerant mobile sensor networks. Using the base station's broadcast information on frequency f1, sensor nodes can not only remove redundant messages but also compute their own delivery probabilities, which are the basis for message forwarding between nodes on frequency f2. In buffer management, HBR manages the message queues according to message survival time and a forwarding domain M. Simulation results show that the HBR scheme achieves a higher message delivery ratio and lower delivery delay than some classic DTMSN routing schemes, with reasonable communication overhead.
Wireless Body Area Network Routing Protocol Based on Max-Min Model
LI Yan, FENG Xian-ju, CHEN Zhuo, ZHOU Yi and WANG Bin
Computer Science. 2016, 43 (Z6): 260-264.  doi:10.11896/j.issn.1002-137X.2016.6A.062
Abstract PDF(1211KB) ( 530 )   
References | Related Articles | Metrics
Due to the limited energy of nodes in wireless body area networks (WBAN), an energy-efficient routing protocol was proposed based on a multipath routing mechanism and the Max-Min model, which keeps the residual energy of the nodes along the route as large as possible, thus balancing the energy consumption of nodes in the network and prolonging the network lifetime. The protocol was simulated and compared with a routing protocol based on the Min model. The results show that the Max-Min based routing protocol better balances the energy consumption of nodes and prolongs the network lifetime.
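The Max-Min criterion in the abstract can be illustrated with a short sketch (hypothetical data and function names, not the paper's implementation): among candidate routes, choose the one whose weakest node has the most residual energy.

```python
def bottleneck_energy(path, energy):
    """Residual energy of the weakest node on a path."""
    return min(energy[n] for n in path)

def max_min_route(paths, energy):
    """Pick the path whose weakest node has the most residual energy."""
    return max(paths, key=lambda p: bottleneck_energy(p, energy))

# illustrative residual-energy map and two candidate routes
energy = {"a": 5, "b": 2, "c": 9, "d": 4}
paths = [["a", "b", "d"], ["a", "c", "d"]]
chosen = max_min_route(paths, energy)  # route through "c" (bottleneck 4 > 2)
```

A plain Min model would only minimize total cost; maximizing the bottleneck instead protects the most depleted node on each route.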
Initial Separating-matrix Optimized Online Blind Source Separation Algorithm
YANG Hua, ZHANG Hang, ZHANG Jiang, YANG Liu and LI Jiong
Computer Science. 2016, 43 (Z6): 265-267.  doi:10.11896/j.issn.1002-137X.2016.6A.063
Abstract PDF(997KB) ( 518 )   
References | Related Articles | Metrics
Aiming at the problem that the convergence speed of online BSS algorithms is affected by the initial separating matrix, a new online BSS algorithm with an optimized initial separating matrix, based on the artificial bee colony (ABC) algorithm, was proposed. The new algorithm utilizes the strong search ability of the batch ABC algorithm to obtain an optimized initial separating matrix, giving the BSS algorithm a better initial iteration point, and then uses gradient descent to achieve online separation, which improves the overall convergence speed of the BSS algorithm. Simulation results confirm that the proposed algorithm increases the convergence speed effectively, and that it remains applicable when the mixing matrix is time-varying.
Game-based Routing Selection and Trust Decisions for DTMSN
CUI Ping-fu, REN Zhi and CAO Jian-ling
Computer Science. 2016, 43 (Z6): 268-271.  doi:10.11896/j.issn.1002-137X.2016.6A.064
Abstract PDF(805KB) ( 552 )   
References | Related Articles | Metrics
Previous selfishness-detection schemes for delay tolerant mobile sensor networks did not consider that a node may be unable to forward data for reasons of its own, which causes inaccurate credit calculation and wasted energy. To address this, this paper introduced the node's remaining energy and a flag indicating whether it can forward packets to improve the previous schemes, and a punishment mechanism is used to prevent nodes from faking the energy and flag information; together these make the credit values more accurate. Game theory is also used to stimulate cooperation under good conditions, as the theoretical analysis confirms. Through the game, nodes cooperate with each other more and are more willing to forward data, which improves the delivery success rate and reduces the network overhead.
Distributed Channel Allocation Algorithm for Multi-services Systems
LI Xiang-yang, ZHAO Hang-sheng, ZHAO Xiao-long and ZHANG Yang
Computer Science. 2016, 43 (Z6): 272-275.  doi:10.11896/j.issn.1002-137X.2016.6A.065
Abstract PDF(1281KB) ( 608 )   
References | Related Articles | Metrics
As one of the key issues in cognitive radio, channel allocation has been widely studied in recent years. Centralized decision algorithms are common, but they are difficult to implement in distributed cognitive radio systems. Based on the college admission matching theory proposed by D. Gale and L. Shapley and the characteristics of distributed systems, the channel association of multi-service systems was formulated as a many-to-one matching game, and a distributed channel allocation algorithm based on a utility matrix was proposed. Simulations show that the convergence time of the proposed algorithm is short and that, under Rayleigh fading, the utility of the stable matching state approximates that of the optimal centralized algorithm and is far better than that of the random access algorithm.
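The many-to-one matching the abstract describes follows Gale and Shapley's deferred acceptance scheme. A minimal sketch, with users as proposers and channels holding limited capacity (the names, utilities and capacities here are illustrative assumptions, not the paper's model):

```python
def deferred_acceptance(users, prefs, utility, capacity):
    """Many-to-one matching: users propose to channels in preference
    order; each channel keeps its best proposers up to its capacity."""
    free = list(users)
    next_choice = {u: 0 for u in users}
    tenants = {}  # channel -> users currently held
    while free:
        u = free.pop()
        if next_choice[u] >= len(prefs[u]):
            continue  # u has exhausted its list and stays unmatched
        ch = prefs[u][next_choice[u]]
        next_choice[u] += 1
        tenants.setdefault(ch, []).append(u)
        if len(tenants[ch]) > capacity[ch]:
            # over capacity: evict the user this channel values least
            worst = min(tenants[ch], key=lambda v: utility[ch][v])
            tenants[ch].remove(worst)
            free.append(worst)
    return tenants

# illustrative instance: three users, channel c1 holds 1 user, c2 holds 2
prefs = {"u1": ["c1", "c2"], "u2": ["c1", "c2"], "u3": ["c1", "c2"]}
utility = {"c1": {"u1": 3, "u2": 2, "u3": 1},
           "c2": {"u1": 1, "u2": 2, "u3": 3}}
match = deferred_acceptance(["u1", "u2", "u3"], prefs, utility,
                            {"c1": 1, "c2": 2})
```

The resulting matching is stable: no user-channel pair would both prefer each other over their assignment, which is the property the paper exploits for distributed convergence.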
Thinking of Constructing High-performance Simulation Platform for Large-scale Network
DU Jing, WANG Qiong, QIN Fu-tong and LIU Ying-long
Computer Science. 2016, 43 (Z6): 276-280.  doi:10.11896/j.issn.1002-137X.2016.6A.066
Abstract PDF(1291KB) ( 599 )   
References | Related Articles | Metrics
With the rapid development of network technology, network scale expands quickly and network topology grows increasingly complicated. Designing a support platform for high-performance simulation of large-scale networks has become an urgent problem for realizing large-scale network simulation. However, the understanding of large-scale network simulation platforms is not yet deep or comprehensive enough, and existing simulation platforms still cannot effectively meet large-scale networks' new requirements of flexible expansion, flexible reconstruction, efficient operation and service orientation. Large-scale network simulation platforms need to be re-examined from different angles and multiple aspects to obtain more scientific platform construction ideas. Therefore, this paper re-examined large-scale network simulation platforms from several new perspectives and went into the problems of environment generation and operation that large-scale network simulation needs to solve. We mainly analyzed the features of large-scale network simulation and the platform requirements, then put forward four thinking models for network simulation platform construction and analyzed the application of these models. The results can provide a theoretical basis and effective measures for realizing high-performance simulation of large-scale networks.
Analyzing Source Code of 802.11 Physical Layer Implementation in NS-3
WANG Yue
Computer Science. 2016, 43 (Z6): 281-284.  doi:10.11896/j.issn.1002-137X.2016.6A.067
Abstract PDF(1358KB) ( 792 )   
References | Related Articles | Metrics
NS-3 is an important network simulator which offers a lower-level abstraction of wireless functionality than NS-2 and is closer to a realistic wireless physical layer. In this work, we read the source code and analyzed the simulation mechanisms of the 802.11 physical layer, including node states and the conditions under which packets can be received, the computation of the starting and ending times of the channel-busy state, packet reception power under the combined effect of multiple path loss and fading models, the computation of bit error rate and packet error rate, and the tracking of multiple interfering packets with chunk-based computation of interference. We gave some advice for protocol modification. This paper contributes to understanding the wireless simulation principles of NS-3.
Simulation and Analysis of AODV Protocol in Fishing Marine VHF Ad Hoc Network
SHEN Dan-dan, WANG Li-hua, WANG Yu and WANG Zhen-zhou
Computer Science. 2016, 43 (Z6): 285-287.  doi:10.11896/j.issn.1002-137X.2016.6A.068
Abstract PDF(785KB) ( 517 )   
References | Related Articles | Metrics
In order to verify the feasibility of the AODV protocol in a fishing marine VHF Ad Hoc network and study its performance, the simulation tool OPNET was used, and packet delivery rate, average delay, normalized routing load and mean hop count were chosen to evaluate the performance of AODV. The effects of network size and node mobile speed on performance were then analyzed. The simulation results indicate that AODV is suitable for small-scale fishing marine VHF Ad Hoc networks in which boat speed is within 10 m/s. The paper summarized the problems of the AODV protocol and gave some advice for improving its performance.
Algorithm of Opportunistic Routing Based on Energy Harvesting Wireless Sensor Networks
TIAN Xian-zhong and XIAO Yun
Computer Science. 2016, 43 (Z6): 288-290.  doi:10.11896/j.issn.1002-137X.2016.6A.069
Abstract PDF(1022KB) ( 443 )   
References | Related Articles | Metrics
In wireless sensor networks, energy harvesting technology can in theory extend the lifetime of nodes indefinitely. Based on this technology, this paper presented a new opportunistic routing algorithm called energy potential opportunistic routing (EPOR). First, the algorithm uses the expected transmission count between a transmitting node and the destination node to measure the distance between them. Then, the sum of a node's residual energy and harvested energy is used to express its energy potential. Finally, the expected transmission count and the energy potential of the node are used to determine the node's back-off time; the node with the shortest back-off time becomes the forwarder. Theoretical analysis and simulation results show that this algorithm not only prolongs the lifetime of the network but also significantly improves the energy balance among nodes.
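The back-off rule the abstract outlines can be sketched as follows; the exact functional form is an assumption for illustration (the abstract only states that back-off is determined by the expected transmission count and the energy potential, with the shortest back-off winning the relay role):

```python
def backoff_time(etx, residual, harvested, base=1.0, alpha=0.5):
    """Hypothetical back-off: grows with the expected transmission
    count (ETX) toward the sink and shrinks with the node's energy
    potential (residual + harvested energy)."""
    potential = residual + harvested
    return base * etx / (alpha + potential)

# two candidate forwarders: (etx, residual energy, harvested energy)
candidates = {"n1": backoff_time(3.0, 2.0, 1.0),
              "n2": backoff_time(2.0, 4.0, 2.0)}
relay = min(candidates, key=candidates.get)  # shortest back-off forwards
```

Favoring nodes that are both closer (lower ETX) and richer in energy is what balances consumption across the network.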
Algorithm of Wireless Sensor Network Routing Based on Energy Aware
LI Xiang and SUN Hua-zhi
Computer Science. 2016, 43 (Z6): 291-294.  doi:10.11896/j.issn.1002-137X.2016.6A.070
Abstract PDF(917KB) ( 496 )   
References | Related Articles | Metrics
Confronted with the problems of low data-collection success rate, load imbalance between nodes and the complexity of link estimation, an energy-aware routing protocol (EARP) was proposed. EARP adopts an energy-aware mechanism together with data transmission between sibling nodes, and uses RSSI to estimate link quality instead of the success rate of data transmission. The results show that, in the same simulation environment, compared with the LEPS routing protocol, the proposed protocol performs more effectively in balancing traffic load, protecting low-energy nodes and prolonging the network lifetime.
Probability Routing Algorithm in DTN Based on Time and Space and Sociality
JIA Jian-xin, LIU Guang-zhong and XU Ming
Computer Science. 2016, 43 (Z6): 295-300.  doi:10.11896/j.issn.1002-137X.2016.6A.071
Abstract PDF(1782KB) ( 534 )   
References | Related Articles | Metrics
Aiming to improve the delivery ratio, reduce delivery latency and reduce network overhead, a probability routing algorithm based on time, space and sociality, called GTSP, was proposed. First, according to the time spans and geographical areas in which nodes have a high probability of encountering each other, each node's table of high-encounter-probability nodes and the table of common shared friends between nodes in a specific time span and geographical area are determined. A node then uses the GTSP routing algorithm to forward data packets, avoiding forwarding to nodes moving in the wrong time span or geographical area. Compared with the Prophet, SprayAndWait and SimBet routing algorithms, simulation results show that GTSP performs better in delivery latency, delivery ratio and network overhead.
Advanced TCPW Algorithm over Satellite Networks
YU Ran, ZHANG Dong and ZOU Qi-jie
Computer Science. 2016, 43 (Z6): 301-305.  doi:10.11896/j.issn.1002-137X.2016.6A.072
Abstract PDF(1180KB) ( 520 )   
References | Related Articles | Metrics
Aiming at the dramatic change of round-trip delay caused by the highly dynamic communication paths of satellite networks and the degradation of estimation precision when TCPW is used in long-delay networks, an improved TCPW algorithm named TCPW-CC was proposed, which uses the congestion coefficient of satellites as the basis for adjusting the congestion window. The new algorithm avoids the influence of the transmission delay of the space link. At the same time, the window adjustment is performed after each RTT rather than only after a packet drop, so the window grows less aggressively than in the original algorithm. Simulation results indicate that TCPW-CC obviously improves system throughput and decreases the packet loss rate.
Evaluation Model of Cloud Computing Resources Dynamic Usability Based on User Behavior Feature
XU Pu-le, WANG Yang, HUANG Ya-kun, HAN Wen-kai and ZHAO Chuan-xin
Computer Science. 2016, 43 (Z6): 306-309.  doi:10.11896/j.issn.1002-137X.2016.6A.073
Abstract PDF(938KB) ( 490 )   
References | Related Articles | Metrics
According to the characteristics of cloud computing resources, which are usually centralized and consistent in quality, we proposed an evaluation model based on the concept of a dynamic reference. We first determined the original reference and gave the resources an initial evaluation using the quality of sources, quality of service and characteristics of the resources. We then employed the dynamic difference and a refreshing threshold to adjust the reference into a dynamic reference, which enables us to re-evaluate the resources. Finally, we proposed a characteristic coefficient to further refine the evaluation based on users' behaviors.
Research of Network Resource Allocation Technology in Heterogeneous Multinetwork Environment
ZHU Wen-hong, REN Hai-jun, WU Liang-jun, LV Lin-jie and WANG Bo
Computer Science. 2016, 43 (Z6): 310-313.  doi:10.11896/j.issn.1002-137X.2016.6A.074
Abstract PDF(989KB) ( 547 )   
References | Related Articles | Metrics
With the rapid development of the modern social economy, the contradiction between the development of wireless communication technology, wireless resources and business services has become more and more intense. Many types of access technologies have formed a heterogeneous network environment. Against this background, resource allocation is one of the most important technologies at the current stage. Given the attribute information available under multi-RAT conditions, this paper focused on the network selection algorithm as its central content.
Link Prediction of AS Level Internet Based on Association Rule of Frequent Closed Graphs
ZHANG Yan-qing, LU Yu-liang and YANG Guo-zheng
Computer Science. 2016, 43 (Z6): 314-318.  doi:10.11896/j.issn.1002-137X.2016.6A.075
Abstract PDF(1274KB) ( 516 )   
References | Related Articles | Metrics
Existing link prediction methods mostly focus on structural link prediction, such as missing links, and few address temporal link prediction of unknown future links; therefore, a link prediction method based on association rules of frequent closed graphs was proposed. Dynamic networks are divided into a training set and a test set, and frequent closed subgraphs are extracted from the training set based on the Apriori algorithm; a time-lag distribution matrix is built to represent the temporal association rules between frequent closed graphs, and the structure in the test set is then predicted. The link prediction method was applied to dynamic networks of the AS-level Internet at different time scales, and experimental results show that it can efficiently predict links in fluctuating dynamic networks with high precision.
Research on Resource Allocation Based on Noncooperation Game for OFDMA-WLAN System
YANG Fan, ZHANG Xiao-song and MING Yong
Computer Science. 2016, 43 (Z6): 319-321.  doi:10.11896/j.issn.1002-137X.2016.6A.076
Abstract PDF(1044KB) ( 548 )   
References | Related Articles | Metrics
To satisfy the different communication requirements of multiple users in downlink transmission of the orthogonal frequency division multiple access (OFDMA) wireless local area network (WLAN) system, a resource allocation algorithm based on a noncooperative game was presented. In this paper, game theory is used as an efficient tool to study resource allocation in WLANs with quality-of-service (QoS) requirements, and different channel quality requirements are converted into a noncooperative game among multiple users for channel resource allocation. The Nash equilibrium problem (NEP) is divided into sub-problems formulated as variational inequalities (VI), which are solved by convex optimization. Numerical results show that the proposed algorithm achieves a better trade-off between fairness of resource allocation and data transmission rate.
Research on Comprehensive Assessment Method of Information System Security Based on System Attack and Defense
WAN Xue-lian and ZHANG Jing-he
Computer Science. 2016, 43 (Z6): 322-327.  doi:10.11896/j.issn.1002-137X.2016.6A.077
Abstract PDF(1406KB) ( 537 )   
References | Related Articles | Metrics
This paper studied a comprehensive assessment method for information system security from the two perspectives of system attack and defense. From the attack perspective, we proposed a quantitative system vulnerability risk assessment model based on the CVE standard, an association rule algorithm and a vulnerability connection network model. From the defense perspective, we established a customized security model for assessing information systems, which simplifies the evaluation of information system assets. Combining system information and security, we put forward "information" as the smallest unit of the information system, and established an AHP-based three-dimensional assessment model according to GB/T 22239-2008. On the basis of the attack and defense assessment results, the comprehensive assessment method realizes quantitative analysis of overall system security, analysis of system vulnerability risk and analysis of the system's security weak points. An application example shows that the method provides an objective, scientific and comprehensive quantitative assessment of information system security.
Finding XSS Vulnerabilities Based on Fuzzing Test and Genetic Algorithm
CHENG Cheng and ZHOU Yan-hui
Computer Science. 2016, 43 (Z6): 328-331.  doi:10.11896/j.issn.1002-137X.2016.6A.078
Abstract PDF(1262KB) ( 608 )   
References | Related Articles | Metrics
To address cross-site scripting (XSS) in Web applications, and building on current research into various XSS vulnerability mining methods, a new method for generating optimized XSS attack samples was proposed. It mines vulnerabilities using fuzzing and genetic algorithms by analyzing the features of XSS vulnerabilities, the filtering methods of websites, and mutation-based optimization. First, an XSS vulnerability database was constructed. Second, fuzzing was used to randomly pre-generate a large number of XSS attack test cases. Then the filter-and-complement principle was used to analyze, select and extract XSS attack features. Finally, a genetic algorithm was applied to search the XSS attack feature space, generating optimal XSS attack test cases through many iterations. Analysis shows that the method can effectively detect XSS vulnerabilities in Web applications.
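The fuzzing-plus-genetic-algorithm loop can be sketched as below. The feature list, the toy filter and the fitness function are illustrative assumptions, not the paper's vulnerability database or filtering rules:

```python
import random

# illustrative attack features and payload building blocks
FEATURES = ["<script", "onerror=", "alert(", "srcdoc="]
TOKENS = ["<script>", "alert(1)", "</script>", "<img ", "onerror=",
          "x ", "srcdoc=", "'>"]

def fitness(payload):
    """Score a payload by how many attack features survive a toy filter."""
    filtered = payload.replace("<script>", "")  # stand-in for site filtering
    return sum(f in filtered for f in FEATURES)

def crossover(a, b):
    return a[: len(a) // 2] + b[len(b) // 2:]

def mutate(payload):
    return payload + random.choice(TOKENS)

def evolve(pop_size=20, generations=30, seed=0):
    """Fuzzing pre-generates random payloads; the GA then iterates
    selection, crossover and mutation to concentrate attack features."""
    random.seed(seed)
    pop = ["".join(random.choices(TOKENS, k=3)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]  # keep the fittest half
        pop = elite + [mutate(crossover(random.choice(elite),
                                        random.choice(elite)))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)
```

In a real system the fitness would come from injecting the payload and observing whether it survives the target's filters, rather than from a hard-coded feature list.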
Method of Duplicate Removal on Alert Logs Based on Attributes Hashing
HU Qian, LUO Jun-yong, YIN Mei-juan and QU Xiao-mei
Computer Science. 2016, 43 (Z6): 332-334.  doi:10.11896/j.issn.1002-137X.2016.6A.079
Abstract PDF(930KB) ( 723 )   
References | Related Articles | Metrics
Alarm logs generated by network security equipment contain a large number of repeated alarms, which hinders real-time analysis of the network threat situation. To solve the problem of accurate real-time de-duplication of alarm logs, we proposed a de-duplication method based on attribute hashing. The method uses attribute hashes for quick detection of duplicate alarms, and at the same time uses a hash table to store the large number of non-repeating alarm logs. Experiments on alarm logs based on the DARPA data set show that the method keeps time complexity low while achieving a de-duplication accuracy of 95%.
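The attribute-hashing idea can be sketched in a few lines; the chosen attribute set and hash function are assumptions for illustration, not the paper's exact design:

```python
import hashlib

def alert_key(alert, keys=("src_ip", "dst_ip", "sig_id", "proto")):
    """Digest the identifying attributes of an alert into a fixed-size key."""
    raw = "|".join(str(alert.get(k, "")) for k in keys)
    return hashlib.sha1(raw.encode()).hexdigest()

def deduplicate(alerts):
    seen = set()   # hash table of attribute digests already observed
    unique = []
    for a in alerts:
        k = alert_key(a)
        if k not in seen:
            seen.add(k)
            unique.append(a)
    return unique

# two of these three alerts share all identifying attributes
alerts = [
    {"src_ip": "10.0.0.1", "dst_ip": "10.0.0.9", "sig_id": 2001, "proto": "tcp"},
    {"src_ip": "10.0.0.1", "dst_ip": "10.0.0.9", "sig_id": 2001, "proto": "tcp"},
    {"src_ip": "10.0.0.2", "dst_ip": "10.0.0.9", "sig_id": 2001, "proto": "tcp"},
]
```

Comparing fixed-size digests instead of full alarm records is what keeps duplicate detection O(1) per alarm.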
Hybrid Intrusion Detection Model Based on Evolutionary Neural Network
QU Hong-chun and WANG Shuai
Computer Science. 2016, 43 (Z6): 335-338.  doi:10.11896/j.issn.1002-137X.2016.6A.080
Abstract PDF(932KB) ( 610 )   
References | Related Articles | Metrics
In order to improve the detection rate of intrusion detection systems and reduce the false alarm rate, misuse detection and anomaly detection were combined to overcome the defects of either single technique, with an improved evolutionary neural network as the detection engine. First, the genetic algorithm was improved to overcome the poor global optimization of real-coded genetic algorithms, reduce computational complexity, and speed up evolutionary convergence. The combination of the improved genetic algorithm with the LM algorithm for BP neural networks further overcomes the slow training and the tendency to fall into local optima in the learning phase of the neural network, thereby increasing the neural network's classification and pattern recognition capabilities. Using the KDD CUP 99 dataset as training and test sets, experimental results show that the hybrid intrusion detection model based on an evolutionary neural network achieves significant improvements in the extraction speed of data feature rules, detection accuracy, and the recognition of new attack types.
Research on Identity Authentication Technology in Cloud Computing
ZHOU Chang-chun, TIAN Xiao-li, ZHANG Ning, YANG Yun-jun and LI Duo
Computer Science. 2016, 43 (Z6): 339-341.  doi:10.11896/j.issn.1002-137X.2016.6A.081
Abstract PDF(1039KB) ( 577 )   
References | Related Articles | Metrics
Addressing the secure authentication of users on cloud platforms, and drawing on the OpenStack cloud platform architecture, its Keystone security authentication component, the main identity authentication security issues in cloud computing and the mainstream identity authentication technologies in current cloud environments, this article analyzed the working principle of OpenID authentication in view of the vulnerability of unified authentication mechanisms and unified identity authentication platforms in the cloud, presented the currently existing security problems of OpenID, and proposed some improvements. Finally, on the basis of the improved OpenID techniques, the identity authentication technology was realized on the OpenStack platform.
EGAKA:An Efficient Group Authentication and Key Agreement Protocol for MTC in LTE-A Network
SONG Ya-peng and CHEN Xin
Computer Science. 2016, 43 (Z6): 342-347.  doi:10.11896/j.issn.1002-137X.2016.6A.082
Abstract PDF(1000KB) ( 658 )   
References | Related Articles | Metrics
Machine type communication (MTC), as the basis of the Internet of Things, is a wide-open market with a great application trend. MTC networks can be strongly supported by LTE-A networks, and the 3rd Generation Partnership Project (3GPP) has formally defined MTC in the Release 10 standard. Compared with normal mobile user equipment, MTC devices have some special features, such as much larger quantity and lower power consumption. These features pose additional research challenges for identity authentication in LTE-A networks. When a mass of MTC devices access the LTE-A network simultaneously and a full authentication and key agreement process is run for each device, the signaling would congest the network. Meanwhile, the limited computational resources of MTC devices do not allow too many operations. Aiming at the congestion problem in the authentication process, an authentication and key agreement protocol based on aggregate proxy signatures and message authentication codes, named EGAKA, was proposed. The protocol adopts aggregate proxy signatures so that LTE-A networks can authenticate multiple MTC devices simultaneously while minimizing communication consumption, and the adoption of message authentication codes decreases the computational cost of the key agreement process. The protocol was then modeled and analyzed with colored Petri nets (CPN), and the results demonstrate that it is safe. Finally, performance analysis shows that its communication consumption is better than that of other protocols of the same kind, and its computational consumption is better than that of comparable protocols that adopt asymmetric encryption.
Research on Rootkit Detection System Architecture Based on Functional Separation in Virtualized Environment
ZHU Zhi-qiang, ZHAO Zhi-yuan, SUN Lei and YANG Jie
Computer Science. 2016, 43 (Z6): 348-352.  doi:10.11896/j.issn.1002-137X.2016.6A.083
Abstract PDF(1328KB) ( 547 )   
References | Related Articles | Metrics
A Rootkit detection system architecture for virtualized environments, XenMatrix, based on functional separation, was proposed in light of the problems that existing Rootkit detection techniques in virtualized environments are easy to evade and incur large performance overhead. The architecture improves the security of the detection system itself while ensuring its transparency. A strategy of adaptively adjusting the detection frequency was also proposed, which dynamically adjusts the Rootkit detection frequency and effectively reduces system overhead. Experimental results show that the prototype system can effectively detect known and unknown Rootkits, with a higher detection success rate and lower performance overhead than existing detection techniques.
Integrity-checking Security Data Aggregation Protocol
LIU Huai-jin, CHEN Yong-hong, TIAN Hui, WANG Tian and CAI Yi-qiao
Computer Science. 2016, 43 (Z6): 353-356.  doi:10.11896/j.issn.1002-137X.2016.6A.084
Abstract PDF(1042KB) ( 675 )   
References | Related Articles | Metrics
In wireless sensor networks (WSN), how to aggregate transmitted data while protecting data privacy and integrity is an important challenge in current Internet of Things applications. Ozdemir proposed the PRDA (Polynomial Regression-based Secure Data Aggregation) protocol, which protects the privacy of aggregated data based on clustering and polynomial properties, but it cannot verify data integrity. Because the aggregated data of the PRDA protocol may have been tampered with or counterfeited, this paper proposed a secure data aggregation protocol called iPRDA which can detect violations of data integrity. It uses polynomial functions and data perturbation techniques to protect data privacy, and detects integrity violations at the base station by exploiting the relationships between data features. Experiments show that with this scheme, data integrity can be verified effectively without affecting data confidentiality.
Attribute-based Encryption Scheme with Outsourcing Decryption Method
DING Xiao-hong, QIN Jing-yuan and WANG Xin
Computer Science. 2016, 43 (Z6): 357-360.  doi:10.11896/j.issn.1002-137X.2016.6A.085
Abstract PDF(879KB) ( 574 )   
References | Related Articles | Metrics
Sahai and Waters proposed attribute-based encryption (ABE), which realizes one-to-many encryption and has extensive application value. With the continuous development of cloud computing, ABE has become closely linked with it. This paper applied a rapid and efficient outsourced computation technique to the ABE decryption algorithm. In cloud computing, the cloud server stores the clients' ciphertexts. When a client wants to decrypt its remote data, it uploads a transform key to the cloud server. Using the transform key, the cloud server transforms the ciphertext into a partially decrypted ciphertext and sends it to the client, who decrypts it to obtain the plaintext. This outsourcing technique saves the client computational overhead. Security and efficiency analysis shows that the proposed scheme is secure and efficient.
Decision Tree Algorithm in Non-invasive Monitoring Cell Phone Traffic
YI Jun-kai, LI Zheng-dong and LI Hui
Computer Science. 2016, 43 (Z6): 361-364.  doi:10.11896/j.issn.1002-137X.2016.6A.086
Abstract PDF(969KB) ( 477 )   
References | Related Articles | Metrics
The purpose of this paper is to solve the difficulty of monitoring and identifying malicious software on cell phones. Aiming at proposing and realizing a mobile traffic monitoring system, we used a non-invasive method to obtain phone traffic data. A decision tree model was established using the ID3 algorithm on the data, and the traffic data were then classified according to the decision tree rules. The experimental results show that the method can identify the traffic generated by the cell phone with a recognition accuracy of more than 92%.
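The core of ID3, which the abstract relies on, is choosing the split attribute with the highest information gain. A minimal sketch (the traffic attributes and labels are illustrative, not the paper's feature set):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """ID3 criterion: entropy reduction from splitting on attr."""
    base = entropy(labels)
    split = {}
    for row, label in zip(rows, labels):
        split.setdefault(row[attr], []).append(label)
    remainder = sum(len(subset) / len(labels) * entropy(subset)
                    for subset in split.values())
    return base - remainder
```

ID3 builds the tree by repeatedly splitting on the attribute with the highest gain; a traffic flow is then classified by walking the resulting rules.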
Research on Safety Method of Wearable Medical Devices
ZHANG Cai-xia and WANG Xiang-dong
Computer Science. 2016, 43 (Z6): 365-369.  doi:10.11896/j.issn.1002-137X.2016.6A.087
Abstract PDF(1277KB) ( 487 )   
References | Related Articles | Metrics
This paper studied the security issues of wearable medical devices, analyzed the strengths and weaknesses of biological keys and quantum keys, and proposed protecting wearable medical devices using both. Moreover, this paper studied secure transmission in the heterogeneous networks composed of wearable medical devices. Based on the analysis of existing key pre-distribution schemes, we proposed applying them to dynamic heterogeneous networks, providing a theoretical and technical basis for addressing the secure transmission of data.
Multiple-replica Provable Data Possession Based on Paillier Encryption
WANG Hui-qing and ZHOU Lei
Computer Science. 2016, 43 (Z6): 370-373.  doi:10.11896/j.issn.1002-137X.2016.6A.088
Abstract PDF(1230KB) ( 501 )   
References | Related Articles | Metrics
In cloud storage services, user data are stored on untrusted cloud storage servers and face security threats. In order to check whether all file replicas are stored intactly by the CSP, a multiple-replica provable data possession scheme based on Paillier encryption and supporting dynamic operations on data replicas, named DMR-PDP, was proposed. To realize multiple-replica checking, file blocks are stored in the cloud server in the form of copies, and differentiable replicas are generated by using the Paillier encryption system to encrypt the concatenation of each replica's serial number with the file. The verification tags are generated by BLS signatures, which enables batch checking of all replicas. File identification and block position information are added into the block tags to prevent both replacing and replay attacks by the CSP. Security analysis and simulation results show that the scheme is better than other methods in the literature in terms of security, communication and computational overhead, greatly improves the efficiency of file storage and verification, and reduces computational cost.
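The differentiable replicas rely on Paillier encryption, whose per-ciphertext randomness makes encryptions of the same block distinguishable while remaining additively homomorphic. A textbook Paillier sketch with toy parameters (real deployments need large primes; this is not the paper's implementation):

```python
import math
import random

def paillier_keygen(p=499, q=547):
    """Toy key generation -- p and q must be large primes in practice."""
    n = p * q
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    g = n + 1                      # simple standard choice of generator
    mu = pow(lam, -1, n)           # valid because g = n + 1
    return (n, g), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(2, n)     # fresh randomness per ciphertext
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    return (((pow(c, lam, n * n) - 1) // n) * mu) % n
```

The randomizer r is why two replicas of the same block encrypt to different ciphertexts, and the multiplicative homomorphism (ciphertext product decrypts to plaintext sum) is what such PDP schemes exploit for aggregated verification.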
Decision Tree Algorithms for Big Data Analysis
ZHANG Yan and CAO Jian
Computer Science. 2016, 43 (Z6): 374-379.  doi:10.11896/j.issn.1002-137X.2016.6A.089
Abstract PDF(1786KB) ( 620 )   
References | Related Articles | Metrics
As a predictive model in machine learning, the decision tree algorithm is widely used in various fields and has been a hot research topic due to its easily understandable results. Since the speed of data generation is increasing explosively, conventional decision tree algorithms cannot deal well with large datasets due to the limitations of memory capacity and processor speed, so novel implementations are urgently needed. Starting from classic implementations and optimization methods of decision trees, and the challenges brought by big data, we compared various kinds of corresponding algorithms. Platforms for big data algorithms were then introduced, and possible future directions were discussed in the end.
Survey of Clustering Algorithms for Big Data
HAI Mo
Computer Science. 2016, 43 (Z6): 380-383.  doi:10.11896/j.issn.1002-137X.2016.6A.090
Abstract PDF(972KB) ( 815 )   
References | Related Articles | Metrics
With the rapid increase of data size, clustering large-scale data is a challenge. Clustering algorithms for big data are very important for stock investment analysis in the traditional finance field, customer segmentation in the Internet finance field, and so on. Firstly, the existing clustering algorithms for big data were categorized, and the advantages and disadvantages of each type were compared. After that, the problems of the existing research were summarized. Finally, future research directions were given.
Summary of Research on Website Structure Optimization Based on User Behaviour Analysis
LI Hui, TANG Meng and CHEN Hao
Computer Science. 2016, 43 (Z6): 384-386.  doi:10.11896/j.issn.1002-137X.2016.6A.091
Abstract PDF(946KB) ( 552 )   
References | Related Articles | Metrics
Website structure optimization based on the analysis of user behaviour has been regarded as a major research direction of Web mining. By generalizing and summarizing the literature at home and abroad, we reviewed the present state of research on website structure optimization based on the analysis of user behaviour. Furthermore, through comparative analysis, we indicated the merits and demerits of each research methodology. Moreover, a discussion of future research directions was included.
Study on Sentiment Analyzing of Internet Commodities Review Based on Word2vec
HUANG Ren and ZHANG Wei
Computer Science. 2016, 43 (Z6): 387-389.  doi:10.11896/j.issn.1002-137X.2016.6A.092
Abstract PDF(732KB) ( 655 )   
References | Related Articles | Metrics
With the rapid development of e-commerce in the network environment, product reviews have become an important data source for enterprises to improve quality and enhance service. A review conveys the user's emotional tendency toward all aspects of the product. Sentiment analysis can not only help businesses understand the advantages and disadvantages of a product, but also provide data support for potential consumers' purchase decisions. This paper presented a novel method to cluster commodity attributes based on a combination neural network and computed the sentiment of Internet commodity reviews using word2vec. It computed semantic similarity and built an emotional dictionary based on word2vec, then used the emotional dictionary to obtain the emotional tendencies of the test texts. The effectiveness and accuracy of the method were validated through experiments.
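The dictionary-building step can be sketched as follows: given word vectors (the toy 4-d vectors below are invented stand-ins for trained word2vec embeddings), words whose cosine similarity to a seed sentiment word exceeds a threshold join the sentiment lexicon.

```python
# Hedged sketch of lexicon expansion by cosine similarity over word vectors.
# The vectors and threshold are assumptions for illustration, not the paper's data.
import numpy as np

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

vectors = {                       # toy "word2vec" embeddings (assumed)
    "excellent": np.array([0.9, 0.1, 0.0, 0.2]),
    "great":     np.array([0.8, 0.2, 0.1, 0.1]),
    "terrible":  np.array([-0.7, 0.1, 0.3, 0.0]),
}

def expand_lexicon(seed, vectors, threshold=0.9):
    """Return words similar enough to the seed to join the sentiment lexicon."""
    sv = vectors[seed]
    return {w for w, v in vectors.items()
            if w != seed and cosine(sv, v) >= threshold}

print(expand_lexicon("excellent", vectors))   # only "great" clears the threshold
```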
Application of Genetic Algorithm on Optimal Sequence of College Entrance Examination Voluntary Report
YANG Bo-kai, LI Xiao-yu, HUANG Yi-ming and LEI Hang
Computer Science. 2016, 43 (Z6): 390-394.  doi:10.11896/j.issn.1002-137X.2016.6A.093
Abstract PDF(1150KB) ( 827 )   
References | Related Articles | Metrics
We proposed a method based on the genetic algorithm (GA) which aims at finding the optimal plan for the college entrance examination voluntary report. The method simulates the process of natural selection and genetic evolution, ranking the college aspirations of different examinees so that they obtain maximum benefit. Under the condition that the set of selectable universities is the same, the sequences of different examinees' data tend to become stable as the procedure iterates and optimizes intelligently. These sequences are stable and optimal, meeting the practical needs of the examinees and achieving the goal of maximizing benefit. The method adopts data from ten universities, including 985, 211 and common colleges, for testing. The results indicate that GA can be used to decide the optimal sequence for the college entrance examination voluntary report with high accuracy and fitness.
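The evolutionary loop behind such a ranking can be sketched in Python. The benefit values and admission probabilities below are invented for illustration, and the fitness assumes the examinee is admitted by the first university in the sequence whose admission succeeds:

```python
# Hedged GA sketch over report sequences: each chromosome is an ordering of
# candidate universities; fitness is expected benefit under first-admission-wins.
import random

random.seed(0)
VALUE = [9, 7, 5, 3]              # benefit of each university (assumed)
P_ADMIT = [0.2, 0.5, 0.7, 0.9]    # admission probability (assumed)

def fitness(order):
    exp, p_free = 0.0, 1.0
    for u in order:               # first admission ends the process
        exp += p_free * P_ADMIT[u] * VALUE[u]
        p_free *= 1.0 - P_ADMIT[u]
    return exp

def crossover(a, b):              # order crossover (OX) on permutations
    i, j = sorted(random.sample(range(len(a)), 2))
    hole = a[i:j]
    rest = [x for x in b if x not in hole]
    return rest[:i] + hole + rest[i:]

def mutate(order):                # swap two positions in place
    i, j = random.sample(range(len(order)), 2)
    order[i], order[j] = order[j], order[i]

pop = [random.sample(range(4), 4) for _ in range(20)]
for _ in range(50):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]
    children = []
    while len(children) < 10:
        child = crossover(random.choice(elite), random.choice(elite))
        if random.random() < 0.2:
            mutate(child)
        children.append(child)
    pop = elite + children

best = max(pop, key=fitness)      # a stable, high-benefit report sequence
```

With these numbers an exchange argument shows sorting by value descending is optimal, which the GA converges toward via elitist selection.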
Near Linear Time Community Detection Algorithm Based on Dynamical Evolution
REN Luo-kun, LI Hui-jia and JIA Chuan-liang
Computer Science. 2016, 43 (Z6): 395-399.  doi:10.11896/j.issn.1002-137X.2016.6A.094
Abstract PDF(1318KB) ( 537 )   
References | Related Articles | Metrics
Detecting communities is crucial for analyzing and designing complicated natural and engineering networks. Existing community detection algorithms rely heavily on optimization and heuristic methods, which cannot balance computational efficiency and accuracy simultaneously. Thus we proposed an evolutionary algorithm which uses a new dynamical system based on the community membership vector to formulate the conditions driving the convergence of the dynamics trajectory. Then we proposed a quality function which can unify the conventional algorithms by selecting appropriate parameters. Furthermore, considering the difficulty of choosing parameters, we established a graph generative model according to the network's prior information, by which the optimal form of the quality function can be obtained automatically. Our algorithm is highly efficient, and its computational complexity is nearly linear in the number of nodes in a sparse network. Finally, extensive experiments were performed on both artificial and real networks, revealing much useful information.
Improved Collaborative Filtering Recommendation Algorithm
HUANG Tao, HUANG Ren and ZHANG Kun
Computer Science. 2016, 43 (Z6): 400-403.  doi:10.11896/j.issn.1002-137X.2016.6A.095
Abstract PDF(970KB) ( 541 )   
References | Related Articles | Metrics
The collaborative filtering recommendation algorithm is one of the most important recommendation technologies in E-commerce recommendation systems, and the similarity measure plays a key role in the accuracy of recommendation results. However, traditional similarity measures ignore the influence on recommendation quality of the number of items commonly rated by two users. Given this situation, a novel approach was first proposed that accounts for the number of common rated items when measuring the similarity between users. Furthermore, to protect the recommendation result from data sparsity, the structural similarity measure of complex networks was employed to evaluate the similarity between users. The experimental results show that the proposed approaches effectively avoid the disadvantages of traditional methods and improve the quality of recommendation.
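One simple way to account for the number of co-rated items (a hedged illustration, not necessarily the paper's exact formula) is to damp the cosine similarity by a significance weight min(|common items|, T)/T, so users who share only one or two ratings are not judged highly similar:

```python
# Sketch: cosine user similarity damped by co-rated-item count (toy data).
import math

ratings = {                      # user -> {item: rating}, invented for illustration
    "u1": {"a": 5, "b": 3, "c": 4},
    "u2": {"a": 4, "b": 3, "c": 5, "d": 2},
    "u3": {"d": 2},
}

def cosine_sim(ru, rv):
    common = set(ru) & set(rv)
    if not common:
        return 0.0
    num = sum(ru[i] * rv[i] for i in common)
    den = (math.sqrt(sum(ru[i] ** 2 for i in common)) *
           math.sqrt(sum(rv[i] ** 2 for i in common)))
    return num / den

def damped_sim(ru, rv, threshold=3):
    """Scale similarity by min(|common|, threshold) / threshold."""
    common = len(set(ru) & set(rv))
    return cosine_sim(ru, rv) * min(common, threshold) / threshold

print(damped_sim(ratings["u1"], ratings["u2"]))  # 3 common items: full weight
print(damped_sim(ratings["u2"], ratings["u3"]))  # 1 common item: damped to 1/3
```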
Event-based Node Influence Analysis in Social Network Evolution
XIONG Chao, CHEN Yun-fang and CANG Ji-yun
Computer Science. 2016, 43 (Z6): 404-409.  doi:10.11896/j.issn.1002-137X.2016.6A.096
Abstract PDF(1455KB) ( 566 )   
References | Related Articles | Metrics
Social influence analysis is an important research focus in the field of social network research, and most existing works on influence analysis focus on static networks. This paper proposed a method of influence analysis based on individual events for network evolution. The traditional diffusion model was improved to adapt to evolution, and the events exhibited in the diffusion were defined. Then two indicators based on individual events, the social index and the influence index, were given to measure the influence of a node. The values of the two indicators were analyzed separately in the experiments to find the most influential nodes in the influence maximization problem, and the performance of the two indicators was compared. The results show that nodes with a high social index are more efficient than nodes with a high influence index at the initial diffusion stage, but when the diffusion meets a bottleneck, nodes with a high influence index can break the bottleneck faster and thus infect more nodes in the network.
Dynamic Filtering Algorithm of Connected Bit Maximum Likelihood Minwise Hash
CAO Yang, YUAN Xin-pan and LONG Jun
Computer Science. 2016, 43 (Z6): 410-412.  doi:10.11896/j.issn.1002-137X.2016.6A.097
Abstract PDF(644KB) ( 534 )   
References | Related Articles | Metrics
Maximum likelihood estimator (RMle) can improve the average accuracy.Connected bit Minwise Hash (RMinwise,c) can exponentially improve efficiency of similarity estimation.Dynamic threshold filter makes further improvement on efficiency.Combining RMinwise,c and dynamic threshold filter,the maximum likelihood dynamic filtering algorithm of Connected bit Minwise Hash was proposed.Experimental results demonstrate that R(TMle,c) can get second-best precision and efficiency,and is the most cost effective business in the estimator options (RMle,RMle,c,RMinwise,c,R(TMle,c).
Link Prediction Algorithm in Protein-Protein Interaction Network Based on Spatial Mapping
HONG Hai-yan and LIU Wei
Computer Science. 2016, 43 (Z6): 413-417.  doi:10.11896/j.issn.1002-137X.2016.6A.098
Abstract PDF(1521KB) ( 517 )   
References | Related Articles | Metrics
Protein-protein interaction (PPI) prediction is essentially the link prediction problem in a complex network. So far, many of the proposed link prediction methods consider either only topological information or only the PPI interaction information within the network, which is not enough. Therefore, this paper proposed a new method in which the PPI network is represented as a weighted graph. In the graph, topology similarity and attribute similarity are calculated from the two nodes' topology and attribute information, so as to predict whether there is a link between the two nodes. In order to balance the two similarities, we adopted a method based on spatial mapping: the similarities are independently mapped into another space, and the spaces are made as close as possible, so as to fuse the topology information and attribute information. The results show that the proposed algorithm has better accuracy and good biological characteristics.
Research of Chinese Comments Sentiment Classification Based on Word2vec and SVMperf
ZHANG Dong-wen, YANG Peng-fei and XU Yun-feng
Computer Science. 2016, 43 (Z6): 418-421.  doi:10.11896/j.issn.1002-137X.2016.6A.099
Abstract PDF(1207KB) ( 665 )   
References | Related Articles | Metrics
In this paper, we used a machine learning method combining SVMperf and word2vec to classify the sentiment of Chinese product reviews. Word2vec trains a word vector for each word of the corpus. By computing the cosine distance between words, similar-concept word clustering is achieved, and with similar-feature clustering, highly similar vocabulary in the domain is used to expand the sentiment lexicon. The high-dimensional word vector representations are trained with word2vec, and PCA (principal component analysis) is used to reduce the dimensionality of the high-dimensional vectors to form the feature vectors. We used two different methods to extract effective affective features, which are trained and predicted by SVMperf, so as to complete the sentiment classification of the text. The experimental results show that the method obtains good results, whether using the similar-concept clustering method to expand the lexicon or completing the sentiment classification task.
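The dimensionality-reduction step can be sketched with PCA via SVD; the random matrix below is a stand-in for the high-dimensional word2vec vectors:

```python
# PCA-by-SVD sketch: project centered "word vectors" onto the top-k
# principal axes. Data are random placeholders, not trained embeddings.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))        # 100 "word vectors", 50-dimensional

def pca(X, k):
    Xc = X - X.mean(axis=0)           # center the data
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T              # project onto top-k principal axes

features = pca(X, 10)                 # compact feature vectors for the SVM
assert features.shape == (100, 10)
```

Because singular values are returned in descending order, earlier feature columns carry at least as much variance as later ones.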
Computational Model of Average Travel Speed Based on K-means Algorithms
GAO Man, HAN Yong, CHEN Ge, ZHANG Xiao-lei and LI Jie
Computer Science. 2016, 43 (Z6): 422-424.  doi:10.11896/j.issn.1002-137X.2016.6A.100
Abstract PDF(967KB) ( 604 )   
References | Related Articles | Metrics
It is possible to retrieve real-time data using a floating bus data acquisition system equipped with positioning and wireless communication apparatus. To explore traffic conditions, a data fusion model based on the K-means clustering algorithm was put forward. The model was used to calculate the average travel speed between adjacent bus stops. First, the K-means clustering algorithm was improved: (1) the cluster number K is not predefined but is the square root of the non-identical sample size, and it differs across sections and times; (2) the initial cluster centers are not random but are selected according to K. Then the sample data were divided into K classes by the improved algorithm and the average travel speed was obtained by the data fusion model. Finally, the average travel speeds of four areas in Qingdao were shown with line charts to explore evolution laws of traffic flow. The research provides strong support for traffic management and residents' travel.
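The two tweaks can be sketched as follows: K is set to the square root of the number of distinct samples rather than fixed in advance, and initial centers are chosen deterministically (here spread evenly over the sorted speed samples, an assumption for illustration rather than the paper's exact rule):

```python
# Sketch of K-means with K = sqrt(distinct samples) and deterministic init,
# fused into an average travel speed. Speed data are synthetic.
import math
import random

random.seed(2)
speeds = sorted(random.uniform(10, 60) for _ in range(50))   # segment speeds, km/h

k = max(1, round(math.sqrt(len(set(speeds)))))               # tweak (1): K = sqrt(n)
centers = [speeds[int(i * len(speeds) / k)] for i in range(k)]  # tweak (2): spread init

for _ in range(20):                                          # Lloyd iterations
    clusters = [[] for _ in range(k)]
    for s in speeds:
        nearest = min(range(k), key=lambda j: abs(s - centers[j]))
        clusters[nearest].append(s)
    centers = [sum(c) / len(c) if c else centers[j]          # keep old center if empty
               for j, c in enumerate(clusters)]

# fuse cluster means, weighted by cluster size, into the average travel speed
avg_speed = sum(len(c) * ctr for c, ctr in zip(clusters, centers)) / len(speeds)
```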
Method of Traffic Anomaly Detection with Incomplete Data
WANG Yu-ling and REN Yong-gong
Computer Science. 2016, 43 (Z6): 425-429.  doi:10.11896/j.issn.1002-137X.2016.6A.101
Abstract PDF(1341KB) ( 556 )   
References | Related Articles | Metrics
The development of urbanization has brought serious traffic problems, and traffic anomaly detection has become one of the hot spots in the field of data mining. Traditional traffic management mainly uses video monitoring, which has limited efficiency for handling traffic problems. A method of traffic anomaly detection with incomplete data (TAD) was proposed in this paper. Firstly, correlation clustering obtains vehicle density information from mobile phone data and reduces the computational cost of processing incomplete data. Secondly, an adaptive parameter-free detection algorithm is designed to capture distributed dynamic anomalies via the phone call volume change rate on the roads, solving the uncertainty problem of road conditions. Finally, an anomaly trajectory algorithm is devised to retrieve the anomaly distribution route and forecast the influence scope, improving the efficiency of anomaly detection. Experimental results show that the TAD method can effectively detect traffic anomalies under different experimental conditions, and our algorithm is better in efficiency and scalability than existing algorithms.
Construction of VDEA and its Application in Lexical Sentimental Orientation Analysis
HUANG Jin-zhu, LI Feng and ZHANG Ke-liang
Computer Science. 2016, 43 (Z6): 430-434.  doi:10.11896/j.issn.1002-137X.2016.6A.102
Abstract PDF(1263KB) ( 545 )   
References | Related Articles | Metrics
Valency grammar focuses mainly on studying the deep semantic structure of sentences through analyzing predicates, and emphasizes the dependency relationships between predicates and their cooperative elements. The grammar is now an effective tool for solving the problem of semantic analysis. The valency-based dictionary of English adjectives (VDEA) constructed in this study is a machine-readable dictionary based on valency grammar. VDEA covers 3170 English adjectives and relevant semantic information such as valency relationships, case relationships, explanations, commendatory and derogatory meanings, semantic classification, semantic features and examples. Based on the dictionary, the study designed a lexical sentimental orientation analysis model. The experimental result is satisfactory.
Opinion Analysis and Recognition of Comparative Sentences in User Views
WU Chen and WEI Xiang-feng
Computer Science. 2016, 43 (Z6): 435-439.  doi:10.11896/j.issn.1002-137X.2016.6A.103
Abstract PDF(1275KB) ( 491 )   
References | Related Articles | Metrics
Comparative sentences drawn from user views on the Internet contain many of the narrators' opinions in their comparative results. This paper summarized the usual conceptual collocations in ten kinds of comparative sentences. We proposed a method of recognizing comparative sentences and analyzing the opinion of a comparative sentence based on the semantic chunks of the sentence. An experiment was conducted on the corpus of the 4th Chinese Opinion Analysis Evaluation, which requires recognizing comparative sentences and analyzing the opinions and elements in a comparative sentence. The experimental results are both better than the average of all systems in the evaluation.
Personalized Recommendation Method Based on Hybrid Computing in Two Layers of Community
HUANG Ya-kun, WANG Yang, SU Yang, CHEN Fu-long and ZHAO Chuan-xin
Computer Science. 2016, 43 (Z6): 440-447.  doi:10.11896/j.issn.1002-137X.2016.6A.104
Abstract PDF(2034KB) ( 571 )   
References | Related Articles | Metrics
Researching the inner structure of social networks performs well in community detection. A hybrid computing model can be constructed from the different levels of communities that have contact with each other. Considering the hybrid computing model in a two-layer community, we applied it to a personalized recommendation system. The method evolves the user-item diagram into a three-dimensional hybrid computing model, and constructs the communities of different layers respectively by fused similarity. We also defined the hybrid computing layer based on the relationship between users and items. By defining different computations for new users, old users, new items and old items, HCPR can recommend precise and diverse information. The experimental results show that the model performs well in representing the relationship between users and items. Compared with U-CF and I-CF, HCPR ensures the precision of the recommendation and rich diversity.
Hadoop-based Public Security Video Big Data Processing Method
LIU Yun-heng and LIU Yao-zong
Computer Science. 2016, 43 (Z6): 448-451.  doi:10.11896/j.issn.1002-137X.2016.6A.105
Abstract PDF(1329KB) ( 487 )   
References | Related Articles | Metrics
Public security video surveillance technology has developed from the network integration phase to deep video application. Facing the continuous flow of public security video data, it is necessary to study new big data processing methods. According to the demands of public security video data, this paper adopted a Hadoop-based video data processing platform and used a face retrieval and recognition algorithm based on MapReduce, to realize intelligent information processing of public security video data and achieve the purpose of public security big data applications.
Classification Rule Mining Based on GEP
FU Hong-wei
Computer Science. 2016, 43 (Z6): 452-453.  doi:10.11896/j.issn.1002-137X.2016.6A.106
Abstract PDF(658KB) ( 475 )   
References | Related Articles | Metrics
Classification rule mining and regression problems differ in that the target attribute in mining classification rules is a discrete nominal value, while the target attribute of a regression problem is a continuous and ordered value. This article mainly introduced two main methods of classification rule mining implemented by GEP, and analyzed how to improve the fitness function to mine classification rules that are easy to understand.
Fast Clustering Algorithm Based on Cluster-centers
ZHOU Lu-yang, CHENG Wen-jie, XU Jian-peng and XU Xiang
Computer Science. 2016, 43 (Z6): 454-456.  doi:10.11896/j.issn.1002-137X.2016.6A.107
Abstract PDF(974KB) ( 512 )   
References | Related Articles | Metrics
To deal with the problem that the classical k-means algorithm adapts poorly to clustering various kinds of clusters, this paper proposed an algorithm that improves on k-means by optimizing cluster centers. It divides large or extended-shape clusters into a number of globular clusters, and then merges these small clusters. Firstly, a group of cluster centers located in high-density regions is selected, and the objects around each cluster center are assigned to their nearest cluster center, forming sub-clusters. Then the merging is completed in accordance with the connectivity between sub-clusters. Experimental results show that the algorithm can adapt to irregularly shaped clusters and is simple.
Behavior Specification Method of Class Based on Abstract State
WANG Wei, DING Eryu and LUO Bin
Computer Science. 2016, 43 (Z6): 457-460.  doi:10.11896/j.issn.1002-137X.2016.6A.108
Abstract PDF(855KB) ( 497 )   
References | Related Articles | Metrics
Defining the specifications of methods can reduce software errors and ensure the correctness of programs. But in object-oriented programs, methods influence each other, so better specification methods are needed. Researchers have tried a variety of methods, such as abstract variables, state abstraction, heaps, inspector methods and so on. In this paper, we gave a behavior specification method of class based on the abstract state. The method relies on the abstract state to resolve the shared dependencies and influences between the class's specified methods, and realizes independent description and runtime validation of the cohesion between specification and implementation.
Symbolic Execution and Human-Machine Interaction Based Auto Vectorization Method
CHEN Yong and XU Chao
Computer Science. 2016, 43 (Z6): 461-466.  doi:10.11896/j.issn.1002-137X.2016.6A.109
Abstract PDF(1705KB) ( 593 )   
References | Related Articles | Metrics
Auto-vectorization is a parallel compiling optimization technology for SIMD vector computing units. It combines multiple identical operations into one SIMD instruction, which can significantly improve system throughput. As SIMD vector computing units are widely used, auto-vectorization technology has become a hot topic in both academia and industry. Focusing on the shortcomings of current auto-vectorization technology, such as the difficulty of identifying code that can be vectorized, the difficulty of selecting the best optimization schema and poor portability, we proposed a new vectorizing method based on symbolic execution and human-machine interaction. The method contains two phases. First, based on symbolic execution technology, it recognizes as much vectorizable code as possible. Then, human-machine interaction is used to determine the exact code to be vectorized. At the same time, the method is portable and can be used for other architectures by only modifying the pattern file. An application example shows that the new technology is feasible and effective.
Summary of Research on Similarity Analysis of Software
HUANG Shou-meng, GAO Hua-ling and PAN Yu-xia
Computer Science. 2016, 43 (Z6): 467-470.  doi:10.11896/j.issn.1002-137X.2016.6A.110
Abstract PDF(1212KB) ( 657 )   
References | Related Articles | Metrics
Similarity analysis of software serves to protect the intellectual property rights of software. Unlike protection techniques, it does not harden the program to increase its resistance to attack; instead, it compares two or more programs to determine whether one contains the other. Such analysis includes clone detection, software forensics, software birthmarking and plagiarism detection. The essential input is the source code or binary executable file of the program, which is converted into a representation that is easier to process, in order to determine the similarity between two programs (or program fragments), or whether one program (in whole or in part) contains the other. Finally, the general form of the algorithms was summarized and a corresponding analysis of each algorithm was made.
Requirements Analysis Based on Grey Clustering Algorithm
HU Wen-sheng, YANG Jian-feng and ZHAO Ming
Computer Science. 2016, 43 (Z6): 471-475.  doi:10.11896/j.issn.1002-137X.2016.6A.111
Abstract PDF(1093KB) ( 462 )   
References | Related Articles | Metrics
The research results of James Martin et al. show that most software faults originate in the requirements phase. To improve the quality of software products, it is very important that the software requirements specification (SRS) remain consistent, correct and unambiguous. The functional requirement statements in the SRS are segmented and part-of-speech tagged by natural language processing technology. Each functional requirement statement is converted into a weight vector based on its keywords, and functional requirement statements with similar semantics can be clustered by the grey clustering algorithm. The clustering results not only help requirements analysts review the SRS, but also help developers and software maintainers carry out their activities.
Towards Understanding Existing Developers’ Collaborative Behavior in OSS Communities
CHEN Dan, WANG Xing, HE Peng and ZENG Cheng
Computer Science. 2016, 43 (Z6): 476-479.  doi:10.11896/j.issn.1002-137X.2016.6A.112
Abstract PDF(1292KB) ( 473 )   
References | Related Articles | Metrics
Understanding developers' cooperative behavior is an essential step to meet the needs of collaborative development activities. A number of studies have explored the joining script of newcomers and the immigration process of developers in open-source software communities. However, little is known about the actual collaborative behavior of existing developers in those communities. In this paper, we conducted an empirical study to gain insight into how existing developers collaborate and which factors affect their collaborative behavior, from the perspectives of both collective interaction and individual expertise. According to data sets collected from Sourceforge.net, the results show that different existing developers prefer different collaboration patterns, and short topological distance (also known as "friends of friends") has a very limited effect on their first collaboration, whereas previous collaborations are positive factors in building new collaborations between existing developers. By contrast, development environment (operating system) and administrator experience seem to be important factors affecting their potential collaboration. The findings are valuable for existing developers to maintain sufficient collaboration awareness, so as to improve the stability and sustainability of OSS communities.
Weight Distribution Algorithm for Massive Video Data Based on HDFS
GUO Jian-hua, YANG Hong-bin and CHEN Sheng-bo
Computer Science. 2016, 43 (Z6): 480-484.  doi:10.11896/j.issn.1002-137X.2016.6A.113
Abstract PDF(1322KB) ( 488 )   
References | Related Articles | Metrics
There is a big difference between distributed computing on video data and distributed computing on text data. Video data are unstructured, and videos of the same size but different content lead to different execution times. For simple structured data, the default load balancer of HDFS can solve the load balancing problem, but video files differ in access frequency and complexity, so the default data distribution mechanism of HDFS cannot solve the load balancing problem well. In this paper, a new algorithm for massive video data redistribution based on HDFS was proposed. Firstly, the access count and the historical analysis time of each video file are recorded. Secondly, these data are quantified and weighted as the load of the video file. Lastly, file replacement is used to exchange high-load videos with low-load videos until each node achieves load balance. Experimental results show that the proposed data redistribution algorithm reduces the processing time of massive video data.
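The exchange step can be sketched as a greedy swap loop; the 0.5/0.5 weights and the file records below are assumptions for illustration, not the paper's calibration:

```python
# Greedy rebalancing sketch: a file's load is a weighted sum of its access
# count and analysis time; swap hot files off overloaded nodes while the
# swap narrows the load gap.
W_ACCESS, W_TIME = 0.5, 0.5          # weights are assumed, not from the paper

def load(video):
    return W_ACCESS * video["accesses"] + W_TIME * video["analysis_time"]

def node_load(videos):
    return sum(load(v) for v in videos)

def rebalance(nodes, max_swaps=100):
    """Swap the hottest file on the most loaded node with the coldest file
    on the least loaded node, as long as the swap narrows the load gap."""
    for _ in range(max_swaps):
        hot = max(nodes, key=lambda n: node_load(nodes[n]))
        cold = min(nodes, key=lambda n: node_load(nodes[n]))
        gap = node_load(nodes[hot]) - node_load(nodes[cold])
        hi = max(nodes[hot], key=load)
        lo = min(nodes[cold], key=load)
        d = load(hi) - load(lo)
        if not 0 < d < gap:          # swap would not narrow the gap
            break
        nodes[hot].remove(hi)
        nodes[cold].remove(lo)
        nodes[hot].append(lo)
        nodes[cold].append(hi)
    return nodes

nodes = {                            # toy cluster state
    "n1": [{"accesses": 90, "analysis_time": 80},
           {"accesses": 70, "analysis_time": 60}],
    "n2": [{"accesses": 10, "analysis_time": 5},
           {"accesses": 20, "analysis_time": 15}],
}
rebalance(nodes)                     # one swap narrows the gap from 125 to 30
```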
Combined Query Expansion Method Based on Copulas Framework
ZHANG Shu-bo, ZHANG Yin, ZHANG Bin and SUN Da-ming
Computer Science. 2016, 43 (Z6): 485-488.  doi:10.11896/j.issn.1002-137X.2016.6A.114
Abstract PDF(1352KB) ( 469 )   
References | Related Articles | Metrics
Hybrid query expansion methods based on semantic and local analysis can provide time-sensitive expansion results with semantic correlation. However, how to effectively combine two different kinds of similarity metrics has not been solved. This paper proposed a hybrid query expansion method based on the Copulas framework to combine different types of similarity metrics. Building on query expansion methods based on semantics and word co-occurrence analysis, the proposed method respectively calculates the semantic and statistical similarity probabilities between expansion words and the query words submitted by the user. It then selects high-quality expansion words to obtain the final expansion word set. The experimental results show that the method makes full use of the advantages of the two kinds of query expansion methods to improve precision, and has better search performance.
Task Driven Data Transmission and Synchronization Method Based on XML
QIAO De-ji and XIAO Wei-dong
Computer Science. 2016, 43 (Z6): 489-492.  doi:10.11896/j.issn.1002-137X.2016.6A.115
Abstract PDF(1086KB) ( 472 )   
References | Related Articles | Metrics
Considering information security and business requirements, people browse and process information on local area networks that are logically or physically isolated from each other, but they do not want to lose powerful data handling abilities. Simulating a wide area network through data transmission and synchronization to avoid information silos is a feasible choice. This paper provided a task-driven data transmission and synchronization method based on XML. The method analyzes the data transmission scene, sets the data templates of tasks and task results, and relies on the specific information system to build task data packages and task result packages automatically. It transfers data by data disk, and synchronizes data by data addressing, location and import. Through this data passageway, isolated information systems are interconnected, and people can process information just as on a wide area network. The method enhances the security level of information systems, reduces the cost of network lines, solves the information silo problem caused by disconnection, and satisfies the network-usage requirements of specific trades.
Real-time Data Loading of Dynamic Data Warehouse Using Index View Set
WU Tong and TAN Guang-wei
Computer Science. 2016, 43 (Z6): 493-496.  doi:10.11896/j.issn.1002-137X.2016.6A.116
Abstract PDF(1153KB) ( 520 )   
References | Related Articles | Metrics
With the wide employment of data warehouse technology, decision support systems using data warehouses are commonly used in corporations, and the dynamic data warehouse has appeared. As dynamic data warehouses become increasingly important in the decision support field, corporations tend to employ decision support systems more frequently to assist tactical decision making, migrating from strategic decision making. However, dynamically updated data in the data warehouse are a prerequisite for tactical analysis. The key to realizing the dynamic features of a data warehouse is obtaining dynamically updated data, i.e., real-time data loading. This article proposed using an index view set to realize real-time data loading in the dynamic data warehouse and demonstrated the feasibility of the approach. It also sheds light on further research on real-time data loading.
Data-flow Analysis for Software Error Detection
ZHANG Guang-mei and LI Jing-xia
Computer Science. 2016, 43 (Z6): 497-501.  doi:10.11896/j.issn.1002-137X.2016.6A.117
Abstract PDF(1125KB) ( 513 )   
References | Related Articles | Metrics
Definition and reference are the two kinds of operations on software variables. An operation on a variable that disobeys the variable-usage rules will result in a software error. To detect such errors, reaching-definitions data-flow analysis and live-variable data-flow analysis of the program must be performed. There may be more than one path to a program site, and the data-flow states on one program path may differ from those on the others, so the must-data-flow and the may-data-flow of the program were calculated to depict accurate data-flow information. The data-flow analysis method works on the control structure over basic blocks. The factors that affect the data flow, such as the definition information that can reach a basic block's entry and exit sites, the live variables reaching a basic block, special operations such as memory allocation and memory release, and the relations between them, were discussed in detail.
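The reaching-definitions analysis mentioned above can be sketched as an iterative fixed-point computation over basic blocks (a minimal sketch; the GEN/KILL sets and the two-block example are illustrative, not taken from the paper):

```python
# Iterative reaching-definitions analysis over a basic-block CFG.
# gen[b]: definitions created in block b; kill[b]: definitions it overwrites.
def reaching_definitions(blocks, preds, gen, kill):
    """Iterate IN/OUT sets to a fixed point; returns (IN, OUT) per block."""
    IN = {b: set() for b in blocks}
    OUT = {b: set(gen[b]) for b in blocks}
    changed = True
    while changed:
        changed = False
        for b in blocks:
            # A definition reaches b's entry if it reaches any predecessor's exit.
            new_in = set().union(*(OUT[p] for p in preds[b])) if preds[b] else set()
            new_out = gen[b] | (new_in - kill[b])
            if new_in != IN[b] or new_out != OUT[b]:
                IN[b], OUT[b] = new_in, new_out
                changed = True
    return IN, OUT

# Two blocks: B1 creates definition d1 of x, B2 redefines x as d2.
blocks = ["B1", "B2"]
preds = {"B1": [], "B2": ["B1"]}
gen = {"B1": {"d1"}, "B2": {"d2"}}
kill = {"B1": {"d2"}, "B2": {"d1"}}
IN, OUT = reaching_definitions(blocks, preds, gen, kill)
```

Live-variable analysis has the same fixed-point structure with the direction reversed (sets propagate backwards from uses).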
Application of Spark in Human Genome
DING Dong-liang, WU Dong-yue and YU Fu-li
Computer Science. 2016, 43 (Z6): 502-504.  doi:10.11896/j.issn.1002-137X.2016.6A.118
Abstract PDF(1106KB) ( 584 )   
References | Related Articles | Metrics
The human genome, as high-value and precious big data, is badly in need of efficient and accurate analysis. In view of the fatal high-latency weakness of the traditional cloud framework Hadoop, the cloud platform Spark emerged in response to the needs of the times. A human genome data system based on Spark will make great contributions to the early detection and treatment of diseases and birth defects.
Research on Evaluation of Computer Network Operation Based on Capacity Factor
SHEN Pu-bing, ZHAO Zhan-dong and GONG Qiang-bing
Computer Science. 2016, 43 (Z6): 505-507.  doi:10.11896/j.issn.1002-137X.2016.6A.119
Abstract PDF(785KB) ( 602 )   
References | Related Articles | Metrics
Network operation capability assessment is a multi-index evaluation problem. This paper first analyzed the factors of network operation capability, and then built a network operational capability evaluation index system covering computer network reconnaissance, attack, defense, support and command, based on the principles for constructing network operational indicators.
Improved Wavelet Neural Network Used for Prediction of Pollutant Emissions in Thermal Power Plants
SU Yin-jiao, SU Tie-xiong, WANG Da-zhen and MA Li-qiang
Computer Science. 2016, 43 (Z6): 508-511.  doi:10.11896/j.issn.1002-137X.2016.6A.120
Abstract PDF(915KB) ( 522 )   
References | Related Articles | Metrics
The wavelet neural network is a kind of learning neural network whose structure is similar to the typical BP neural network; the activation function of the hidden layer is a wavelet basis function. The improved wavelet neural network shows an obvious improvement in data prediction. The pollution problem of thermal power plants concerns the whole national economy and people's livelihood. Applying the predictive ability of the wavelet neural network in the actual production process will help promote national economic development and improve people's quality of life.
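As a sketch of the structure described above, a hidden layer with wavelet activations can be written as follows (assuming the commonly used Morlet wavelet cos(1.75t) * exp(-t^2/2); all weights, scales and shifts below are illustrative, and training is omitted):

```python
# Forward pass of a wavelet neural network: a BP-like network whose hidden
# units apply a wavelet basis function to a scaled, shifted weighted sum.
import math

def morlet(t):
    """Morlet mother wavelet, a common choice of hidden-layer basis."""
    return math.cos(1.75 * t) * math.exp(-t * t / 2.0)

def wnn_forward(x, w_in, a, b, w_out):
    """Hidden unit j computes morlet((w_j . x - b_j) / a_j); output is linear."""
    hidden = [morlet((sum(wi * xi for wi, xi in zip(w, x)) - bj) / aj)
              for w, aj, bj in zip(w_in, a, b)]
    return sum(wo * h for wo, h in zip(w_out, hidden))

# Two inputs, two hidden wavelet units, one output; parameters are illustrative.
y = wnn_forward([0.5, 1.0],
                [[0.2, 0.4], [0.1, -0.3]],   # input weights per hidden unit
                [1.0, 2.0],                  # scale (dilation) parameters a_j
                [0.0, 0.5],                  # shift (translation) parameters b_j
                [0.6, 0.4])                  # output weights
```

In training, the dilations and translations are adjusted by gradient descent together with the weights, just as in BP learning.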
Design and Implementation of Intelligent Parking System Based on Wi-Fi Fingerprint Location Technology
HUANG Xu, FAN Jing, WU Mao-nian and GU Yong-gen
Computer Science. 2016, 43 (Z6): 512-515.  doi:10.11896/j.issn.1002-137X.2016.6A.121
Abstract PDF(1171KB) ( 553 )   
References | Related Articles | Metrics
An intelligent parking system was designed in this paper. It addresses the problem of urban parking difficulty by using Wi-Fi fingerprint positioning technology to implement low-cost deployment, card-free operation and intelligent navigation. The system also provides intelligent reverse car-seeking guidance and automatic payment to improve the efficiency of parking management and the user experience, and it promotes the further application of information technology in parking management. Experiments show that this scheme can meet the positioning-accuracy requirements of the intelligent parking system under low-cost deployment.
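Wi-Fi fingerprint positioning of the kind used above is commonly implemented as weighted nearest-neighbour matching in RSSI space (a minimal sketch; the fingerprint database, coordinates and 1/d weighting are illustrative assumptions, not the paper's exact algorithm):

```python
# Online phase of fingerprint positioning: match an observed RSSI vector
# against an offline database of (position -> RSSI vector) fingerprints.
def locate(observed, fingerprints, k=2):
    """Weighted k-NN in RSSI space; fingerprints maps (x, y) -> RSSI vector."""
    dists = sorted(
        (sum((o - r) ** 2 for o, r in zip(observed, rssi)) ** 0.5, pos)
        for pos, rssi in fingerprints.items())
    nearest = dists[:k]
    weights = [1.0 / (d + 1e-9) for d, _ in nearest]   # closer match, more weight
    total = sum(weights)
    x = sum(w * p[0] for w, (_, p) in zip(weights, nearest)) / total
    y = sum(w * p[1] for w, (_, p) in zip(weights, nearest)) / total
    return x, y

# Toy database: three calibration points, RSSI from two access points (dBm).
db = {(0, 0): [-40, -70], (0, 5): [-55, -60], (5, 0): [-70, -45]}
pos = locate([-42, -68], db)
```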
Research on Prediction of Water Resource Based on Improved Combination Neural Network
WANG Jian
Computer Science. 2016, 43 (Z6): 516-517.  doi:10.11896/j.issn.1002-137X.2016.6A.122
Abstract PDF(686KB) ( 530 )   
References | Related Articles | Metrics
China is a country with large water resources, and in the accelerating process of urbanization it faces a series of major challenges such as urban population growth and water pollution, so scientific and rational forecasting of water resource demand becomes a key task for protecting the environment and maintaining sustainable development. This paper surveyed various neural network algorithms in the context of water resource demand forecasting, and introduced a fuzzy feedback method to improve the entropy method used to determine the weighting factors of the combination forecasting model, thereby establishing a neural network forecasting model. The algorithm can not only automatically deduce future trends of water resources from historical data, but also introduces feedback and evolution mechanisms, so that users can adjust the solution accuracy to control the convergence speed of the algorithm. Experiments show that the combination-model neural network proposed in this paper performs better in applications where data accuracy is not high and hydrological data are incomplete.
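The entropy method mentioned above for weighting a combination forecast can be sketched as follows (a minimal sketch without the paper's fuzzy feedback improvement; the error series are illustrative):

```python
# Entropy method for weighting a combination forecast: a model whose relative
# errors are spread evenly (high entropy) carries little distinguishing
# information and receives low weight. Error values are illustrative.
import math

def entropy_weights(errors):
    """errors[i] is the absolute-error series of model i over the same periods."""
    n = len(errors[0])
    d = []
    for err in errors:
        total = sum(err)
        p = [e / total for e in err]                       # normalized errors
        H = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(n)
        d.append(1.0 - H)                                  # divergence of model i
    s = sum(d)
    return [di / s for di in d]

def combine(forecasts, weights):
    """Weighted combination forecast for a single period."""
    return sum(w * f for w, f in zip(weights, forecasts))

w = entropy_weights([[0.1, 0.1, 0.1], [0.3, 0.05, 0.05]])
```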
RFID Technology Realization in Android System
ZHAO Zuo-ren and LIU Ting-long
Computer Science. 2016, 43 (Z6): 518-522.  doi:10.11896/j.issn.1002-137X.2016.6A.123
Abstract PDF(1396KB) ( 756 )   
References | Related Articles | Metrics
Applications of the Android smart phone system and Bluetooth wireless technology have become more and more widespread against the background of the Internet of Things. The modules of the Android platform and its wireless technologies are hot topics of current study. We analyzed the bottom-up technical details of Bluetooth on the Android platform, and connected an Android device with RFID devices over Bluetooth, to promote the development and application of Bluetooth technology on the Android platform and thus contribute to the development of the Internet of Things.
Books Recommended Reading System Based on Android
DING Yong and ZHU Chang-shui
Computer Science. 2016, 43 (Z6): 523-525.  doi:10.11896/j.issn.1002-137X.2016.6A.124
Abstract PDF(787KB) ( 551 )   
References | Related Articles | Metrics
With the development of mobile Internet technology, mobile reading has become a life habit. To help readers find their favorite books among tens of thousands in the "sea of books", this paper applied the classic FP-Growth frequent-itemset mining algorithm to a book recommendation system. Based on readers' reading history records, the algorithm mines frequent book combinations and extracts association rules satisfying minimum support and minimum confidence thresholds. According to these rules, books can be recommended intelligently. Experiments show that this system can offer rapid and accurate book recommendation service for readers.
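The rule-extraction step described above can be sketched as follows (a minimal sketch that counts itemset supports by brute force where FP-Growth would use a prefix tree; the reading histories and thresholds are illustrative):

```python
# Mine association rules from reading histories: count supports of single
# books and pairs, then keep rules meeting minimum support and confidence.
from itertools import combinations

def mine_rules(histories, min_sup, min_conf):
    n = len(histories)
    count = {}
    for h in histories:
        items = set(h)
        for item in items:
            count[frozenset([item])] = count.get(frozenset([item]), 0) + 1
        for pair in combinations(sorted(items), 2):
            count[frozenset(pair)] = count.get(frozenset(pair), 0) + 1
    rules = []
    for s, c in count.items():
        if len(s) == 2 and c / n >= min_sup:              # frequent pair
            a, b = sorted(s)
            for x, y in ((a, b), (b, a)):                 # rule x -> y
                conf = c / count[frozenset([x])]
                if conf >= min_conf:
                    rules.append((x, y, conf))
    return rules

hist = [["A", "B"], ["A", "B", "C"], ["A", "C"], ["B"]]
rules = mine_rules(hist, min_sup=0.5, min_conf=0.6)
```

A recommender then suggests the consequent y of any rule whose antecedent x appears in the reader's history.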
Research on Haze Prediction Based on Multivariate Linear Regression
FU Qian-rao
Computer Science. 2016, 43 (Z6): 526-528.  doi:10.11896/j.issn.1002-137X.2016.6A.125
Abstract PDF(681KB) ( 607 )   
References | Related Articles | Metrics
This paper presented a method to predict haze based on multiple linear regression analysis with online sample updating. First, data on Beijing weather conditions are collected, including average temperature, humidity, wind level and other meteorological data, as well as PM2.5, CO, NO2, SO2 and other atmospheric concentration data. Then, the main influencing factors are analyzed through scatter plots, and the factors that have a great effect on haze are selected as the basis of the haze forecast. Furthermore, a PM2.5 prediction model is built by the multiple linear regression method, and the prediction results combined with meteorological factors form the criterion for haze judgment. Finally, this paper provided an actual example of haze prediction, which uses multiple linear regression to forecast the weather condition of Beijing one day, three days and seven days ahead.
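The regression step can be sketched via the normal equations (X^T X) beta = X^T y (a minimal sketch; the toy predictor data below stand in for the humidity and wind observations and are not the paper's data):

```python
# Fit a multiple linear regression by forming the normal equations
# (X^T X) beta = X^T y and solving them with Gaussian elimination.
def fit_mlr(X, y):
    rows = [[1.0] + list(r) for r in X]           # prepend intercept column
    k = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    for i in range(k):                            # elimination, partial pivoting
        piv = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            A[r] = [a - f * c for a, c in zip(A[r], A[i])]
            b[r] -= f * b[i]
    beta = [0.0] * k                              # back substitution
    for i in range(k - 1, -1, -1):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return beta

# Toy data: PM2.5 ~ humidity + wind, generated from y = 2 + 3*h - 1*w.
X = [(1, 2), (2, 1), (3, 3), (4, 2), (5, 5)]
y = [2 + 3 * h - 1 * w for h, w in X]
beta = fit_mlr(X, y)                              # recovers [2, 3, -1]
```

Online sample updating amounts to refitting (or recursively updating) beta as each day's new observations arrive.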
Design and Implementation of Emergency Receiving Terminal Based on FM
HE Gui-li, HU Zi-jian and JIANG Zi-quan
Computer Science. 2016, 43 (Z6): 529-532.  doi:10.11896/j.issn.1002-137X.2016.6A.126
Abstract PDF(892KB) ( 589 )   
References | Related Articles | Metrics
To solve the problem of receiving public emergency information in complex disaster scenarios, this paper realized a kind of emergency receiving terminal based on a traditional smart phone and software design. The terminal can receive RDS messages broadcast over the FM network in time and warn people. In this way, it can make up for the lack of public emergency information reception when public networks are paralyzed.
Research and Application of Cloud Push Platform Based on Multi-source and Heterogeneous Data
LU Jia-wei, WANG Chen-hao, XIAO Gang and XU Jun
Computer Science. 2016, 43 (Z6): 533-537.  doi:10.11896/j.issn.1002-137X.2016.6A.127
Abstract PDF(1378KB) ( 579 )   
References | Related Articles | Metrics
In traditional push services, the push of multi-source heterogeneous data suffers from poor timeliness, low security and difficult reuse. Considering the characteristics of heterogeneous data and the security and privacy characteristics of the mobile Internet, a multi-dimensional decision cloud push model was proposed to compute the eigenvalues and eigenvectors of multi-source heterogeneous data in a distributed environment, for the rapid separation of data sources into homogeneous and heterogeneous data. Based on this model, a cloud push platform was designed, and the automatic separation and highly efficient push of homogeneous and heterogeneous data were realized with cloud push technology. Analysis of the platform's operation and related indexes in the experimental environment shows that the platform can be applied to multi-source heterogeneous data.
Chinese Stock Market Efficiency Testing Based on Genetic Programming
WANG Hong-xia and CAO Bo
Computer Science. 2016, 43 (Z6): 538-541.  doi:10.11896/j.issn.1002-137X.2016.6A.128
Abstract PDF(908KB) ( 534 )   
References | Related Articles | Metrics
There is a contradiction between modern capital market theory and financial investment practice, namely between the efficient market hypothesis and technical analysis. Using popular technical trading rules to examine the effectiveness of the stock market may lead to two types of conclusion bias. In genetic programming, candidate solutions are represented as tree structures, which can describe technical trading rules well. This paper used a genetic programming algorithm to generate technical trading strategies, which were tested on Shanghai indexes and five stocks in the Shanghai and Shenzhen stock markets. The back-test results show that genetic programming generates the best technical trading strategy, with significant excess profit compared with the buy-and-hold strategy and the usual popular technical indicators. Therefore, it can be concluded that the Chinese stock market has not achieved weak-form efficiency.
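The tree representation of trading rules mentioned above can be sketched as follows (a minimal sketch; the rule "price above its 3-day moving average" and the price series are illustrative, not rules evolved in the paper):

```python
# Evaluate a technical trading rule represented as an expression tree,
# the representation genetic programming mutates and recombines.
def evaluate(node, prices, t):
    """Nodes are tuples: ('gt', left, right), ('price',), ('ma', window)."""
    op = node[0]
    if op == "price":
        return prices[t]
    if op == "ma":                       # simple moving average over a window
        w = node[1]
        return sum(prices[t - w + 1:t + 1]) / w
    if op == "gt":                       # boolean comparison node
        return evaluate(node[1], prices, t) > evaluate(node[2], prices, t)
    raise ValueError(op)

rule = ("gt", ("price",), ("ma", 3))     # buy signal when price > MA(3)
prices = [10, 11, 12, 11, 13]
signal = evaluate(rule, prices, t=4)
```

Genetic programming searches over such trees, scoring each by its back-tested excess return.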
Flight Delays Early Warning Management and Analysis Based on Data Mining
LUO Feng-e, ZHANG Cheng-wei and LIU An
Computer Science. 2016, 43 (Z6): 542-546.  doi:10.11896/j.issn.1002-137X.2016.6A.129
Abstract PDF(1444KB) ( 609 )   
References | Related Articles | Metrics
It is still hard to model domestic flight delays and evaluate their consequences in an authoritative way at present. In this paper, based on a large number of airlines' historical operation data, a mathematical flight delay forecast model was derived to estimate the delay of each flight by analyzing the main factors influencing flight delay and considering the scheduled flight delay rate, the average delay time and the number of delayed passengers; Markov theory, as a data-mining-based prediction modeling method, is used to forecast the evaluation indexes. The weights are obtained using the fuzzy analytic hierarchy process (FAHP), and the condition of flight delay is evaluated by fuzzy comprehensive evaluation. Finally, a flight delay alarming index system was constructed. Experiments with the developed delay forecast model show that these alarming indexes can reflect the condition of flight delay accurately and objectively, so they can provide support for the theory and method of flight delay early-warning management.
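The fuzzy comprehensive evaluation step can be sketched as a weighted-average composition of the FAHP weight vector with a membership matrix (a minimal sketch; the weights, delay grades and membership degrees are illustrative, not the paper's results):

```python
# Fuzzy comprehensive evaluation: compose an index weight vector with a
# membership matrix, then pick the grade with the highest composite degree.
def fuzzy_evaluate(weights, membership):
    """membership[i][j]: degree to which index i belongs to delay grade j."""
    grades = len(membership[0])
    return [sum(w * row[j] for w, row in zip(weights, membership))
            for j in range(grades)]

weights = [0.5, 0.3, 0.2]              # e.g. delay rate, mean delay, passengers
membership = [[0.7, 0.2, 0.1],         # grades: slight / moderate / severe
              [0.2, 0.5, 0.3],
              [0.1, 0.4, 0.5]]
scores = fuzzy_evaluate(weights, membership)
grade = scores.index(max(scores))      # maximum-membership principle
```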
Design and Implementation of Personnel Management System of SME Based on C/S Structure
ZHANG Meng
Computer Science. 2016, 43 (Z6): 547-550.  doi:10.11896/j.issn.1002-137X.2016.6A.130
Abstract PDF(909KB) ( 549 )   
References | Related Articles | Metrics
Chinese enterprise information management is gradually developing towards modernization and high efficiency, so office automation has become necessary, and a modern personnel management system can help enterprises improve their competitiveness. Based on the characteristics of the human resources of small and medium-sized enterprises, a personnel management system based on the C/S structure was presented in this paper. Using this system in personnel management can regularize the personnel system, save human resource costs, and improve office efficiency.
Internet-based Electric Submersible Pump for Remote Monitoring of Temperature and Pressure
ZHANG Hong-wei and DANG Rui-rong
Computer Science. 2016, 43 (Z6): 551-554.  doi:10.11896/j.issn.1002-137X.2016.6A.131
Abstract PDF(975KB) ( 510 )   
References | Related Articles | Metrics
Because the number of wells is large and they are usually distributed far from each other over a wide range, the intensity of labor is undoubtedly increased and real-time monitoring of equipment is affected. Given this status, an Internet-based electric submersible pump system for remote monitoring of temperature and pressure was proposed. STM32F072 is used as the core chip of the system, the ZigBee wireless sensor network module is used to collect wireless data at close range, a generic 12864 LCD is used to display the data, and the system accesses WiFi modules to realize network transmission of data to a server with a static IP. The Android client establishes a TCP connection through the server and receives the real-time data, realizing remote network transmission of the data.
Format Description of Computer Applications and Software
LI Xiu-yun
Computer Science. 2016, 43 (Z6): 555-557.  doi:10.11896/j.issn.1002-137X.2016.6A.132
Abstract PDF(701KB) ( 491 )   
References | Related Articles | Metrics
With the rapid development of information technology, the traditional approval process faces many problems, such as low efficiency, high costs and hard-coded logic. This paper put forward an online approval process algorithm based on the Activiti framework, and the online approval process was optimized. It improves the efficiency of the approval process, and has obtained good effects in practical application.
Key Technology of Access Network Supporting in Intelligent Power Distribution Business
CHEN Yan, WU Zan-hong, WANG Bo, REN Hai-jun and KONG Wei-chan
Computer Science. 2016, 43 (Z6): 558-560.  doi:10.11896/j.issn.1002-137X.2016.6A.133
Abstract PDF(778KB) ( 539 )   
References | Related Articles | Metrics
Intelligent control technology can effectively improve the construction efficiency of the smart distribution network and the safety factor of the energy supply, so strengthening the intelligent construction level of the distribution network is of great significance. Through comprehensive analysis of smart distribution grid technology research and the construction situation of regional intelligent systems, we explored smart distribution grid technology and the application technology of its related support systems, and carried out further study on regional power grid intelligent technology. The geographic information system of distribution automation construction, panoramic information intelligent transportation technology and related key technologies were analyzed. We described the existing problems and status of actual operation, hoping to provide help for the comprehensive construction of the power distribution network.
Optimized Design of LANDMARC RFID Location System
TIAN Ye-fei and WANG Shu-che
Computer Science. 2016, 43 (Z6): 561-562.  doi:10.11896/j.issn.1002-137X.2016.6A.134
Abstract PDF(475KB) ( 519 )   
References | Related Articles | Metrics
In an RFID positioning system, the accuracy of the LANDMARC positioning algorithm is related to the selection of reference tags. The traditional algorithm is limited to a small range of 3 to 5 reference tags in the positioning environment, and in larger positioning environments the positioning accuracy is poor. Therefore it is necessary to optimize the selection of reference tags to improve the positioning accuracy of the system. Finally, the optimized system was simulated, and its positioning accuracy was verified.
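The underlying LANDMARC estimate referred to above is commonly written as a weighted average of the k nearest reference tags, with weights proportional to 1/E^2 of the RSSI-space distance E (a minimal sketch; the reference-tag layout and RSSI values are illustrative):

```python
# LANDMARC-style position estimate: compare the tracked tag's RSSI vector
# (one reading per reader) with each reference tag's vector, pick the k
# nearest in RSSI space, and average their known positions with 1/E^2 weights.
def landmarc(tag_rssi, ref_tags, k=3):
    """ref_tags: list of ((x, y), rssi_vector) for the reference tags."""
    E = sorted(
        (sum((t - r) ** 2 for t, r in zip(tag_rssi, rssi)) ** 0.5, pos)
        for pos, rssi in ref_tags)
    nearest = E[:k]
    w = [1.0 / (e * e + 1e-9) for e, _ in nearest]
    s = sum(w)
    return (sum(wi * p[0] for wi, (_, p) in zip(w, nearest)) / s,
            sum(wi * p[1] for wi, (_, p) in zip(w, nearest)) / s)

refs = [((0, 0), [-50, -60]), ((1, 0), [-52, -58]),
        ((0, 1), [-48, -62]), ((1, 1), [-55, -55])]
est = landmarc([-50, -60], refs, k=3)   # matches the tag at (0, 0)
```

Reference-tag selection determines which candidates enter this k-nearest set, which is why it dominates accuracy in large environments.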
Design of Robot System Based on Stereo Perception Technology
MA Wen-zhuo and ZHANG Jie
Computer Science. 2016, 43 (Z6): 563-567.  doi:10.11896/j.issn.1002-137X.2016.6A.135
Abstract PDF(1112KB) ( 475 )   
References | Related Articles | Metrics
The traditional precision machining process is realized in three steps: target location, program control and automatic operation. In target location, the accuracy of the traditional method of fixing material is difficult to guarantee; as equipment wear increases, the accuracy of the production line continuously declines, so automatic production cannot meet high-accuracy machining needs. Therefore, three-dimensional positioning technology is needed, which converts relative position information into the intensity of a wireless signal to obtain quantified positioning information. Using this RSSI stereo positioning method, we built three-dimensional models of the processing material and the automated manipulator, and set parameters that can be identified by program control, thereby solving the problem. The high accuracy of this system is not reduced by physical wear, so the quality of production is guaranteed.
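RSSI-based positioning of the kind used above typically relies on the log-distance path-loss relation RSSI(d) = A - 10 * n * log10(d), inverted to recover distance from signal strength (a minimal sketch; the reference power A and path-loss exponent n are illustrative assumptions):

```python
# Log-distance path-loss model: A is the RSSI at 1 m and n the path-loss
# exponent; both default values below are illustrative assumptions.
def rssi_to_distance(rssi, A=-40.0, n=2.0):
    """Distance in metres implied by a received signal strength in dBm."""
    return 10 ** ((A - rssi) / (10 * n))

d = rssi_to_distance(-60.0)   # stronger attenuation implies larger distance
```

Distances recovered this way from several transmitters can then be combined (e.g. by trilateration) into the three-dimensional position of the material and manipulator.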
Decision Analysis of Customers in Vacation Queueing Systems Based on Additive Time Effects
WANG Xiao and LI Ji-hong
Computer Science. 2016, 43 (Z6): 568-570.  doi:10.11896/j.issn.1002-137X.2016.6A.136
Abstract PDF(919KB) ( 493 )   
References | Related Articles | Metrics
In this paper, an additive time effect was introduced into the vacation queueing system. From the viewpoint of customers' decisions, the entry probabilities under equilibrium and under social optimality are obtained, and the relationship between the two probabilities shows that more concern for personal benefit induces a heavier system load and affects the social benefit. To reconcile personal and social benefits, a fixed entry fee is given based on the ex-ante externalities, and three variable entry fees are presented from the viewpoint of ex-post externalities, enabling different companies to apply different fee mechanisms.
Application of Distributed Virtualized Storage in Public Security College
ZHU Kang-lin
Computer Science. 2016, 43 (Z6): 571-576.  doi:10.11896/j.issn.1002-137X.2016.6A.137
Abstract PDF(1579KB) ( 616 )   
References | Related Articles | Metrics
With the development of cloud computing and big data, traditional SAN-based storage technology cannot meet new needs. Facing the emerging demand for huge storage and web applications, the public security college needs a new storage framework. Through software upgrades and new functions, performance has been improved, and product development has shifted away from hardware-based evolution. In this paper, we introduced the work using vSAN and its applications at Shanghai Police College.