Applied Mechanics and Materials Vols. 513-517
Paper Title | Page
Abstract: This paper first introduces techniques for junk (spam) SMS filtering based on the Bayes algorithm, analyzing the principle and model characteristics of such filters. It then presents the framework and processing flow of a Bayesian filter together with simulation results. It also proposes an improved Bayesian filtering method that increases the filtering accuracy, and closes with conclusions.
1197
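As an illustration of the Bayesian filtering principle summarized in the abstract above, the following minimal Python sketch classifies an SMS as spam or ham with a multinomial naive Bayes rule. It is not the authors' implementation; the toy training messages and the Laplace smoothing constant are illustrative assumptions.

```python
import math
from collections import Counter

def train_naive_bayes(messages, labels, alpha=1.0):
    """Estimate log priors and Laplace-smoothed token log-likelihoods."""
    vocab = set(tok for msg in messages for tok in msg.split())
    counts = {"spam": Counter(), "ham": Counter()}
    class_totals = Counter(labels)
    for msg, lab in zip(messages, labels):
        counts[lab].update(msg.split())
    model = {"prior": {}, "loglik": {}}
    for lab in ("spam", "ham"):
        model["prior"][lab] = math.log(class_totals[lab] / len(labels))
        total = sum(counts[lab].values()) + alpha * len(vocab)
        model["loglik"][lab] = {
            tok: math.log((counts[lab][tok] + alpha) / total) for tok in vocab
        }
    return model

def classify(model, message):
    """Return the class with the higher posterior log-score."""
    scores = {}
    for lab in ("spam", "ham"):
        scores[lab] = model["prior"][lab] + sum(
            model["loglik"][lab].get(tok, 0.0) for tok in message.split()
        )
    return max(scores, key=scores.get)

if __name__ == "__main__":
    msgs = ["win cash prize now", "call me tonight", "free prize call now", "see you at lunch"]
    labs = ["spam", "ham", "spam", "ham"]
    model = train_naive_bayes(msgs, labs)
    print(classify(model, "free cash now"))   # expected: spam
```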
Abstract: To enable computers to recognize common handwritten digits better and faster, and building on the strengths of structural feature extraction methods, this paper designs a new feature extraction method based on the Euler algorithm. It uses fewer eigenvectors while retaining the key topological information of each of the digit characters 0-9. For a few specially written digits, a dedicated simplification step is applied so that features are easy to extract and identify. Experiments show that this feature extraction method is simple and effective, saves feature extraction time, and improves the recognition rate compared with other algorithms, confirming the effectiveness and practicality of the new algorithm.
1202
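The Euler number referred to in the abstract above (connected components minus holes) captures the topology that separates, for example, "0", "8" and "1". The sketch below is a hedged illustration using scipy.ndimage, not the authors' exact feature extractor; the toy 7x7 image of a "0" and the use of 4-connectivity are assumptions.

```python
import numpy as np
from scipy import ndimage

def euler_number(binary_img):
    """Euler number = foreground connected components - holes.

    Holes are background components that do not touch the image border.
    """
    fg_labels, n_fg = ndimage.label(binary_img)
    bg_labels, n_bg = ndimage.label(1 - binary_img)
    border = (set(bg_labels[0, :]) | set(bg_labels[-1, :]) |
              set(bg_labels[:, 0]) | set(bg_labels[:, -1]))
    border.discard(0)                      # 0 marks foreground pixels, not a component
    holes = n_bg - len(border)
    return n_fg - holes

if __name__ == "__main__":
    # Toy 7x7 image of a "0": one component with one hole -> Euler number 0.
    zero = np.zeros((7, 7), dtype=int)
    zero[1:6, 1:6] = 1
    zero[2:5, 2:5] = 0
    print(euler_number(zero))   # expected: 0 (an "8" would give -1, a "1" gives 1)
```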
Abstract: Unpredictable IT business transaction requirements make it difficult to design a data center of the right size. Over-designed data centers lead to surplus capital investment and lifetime operating costs; legacy data centers designed before the 2nd millennium were over-designed by more than 60% of the actual load. The research objectives are to create a model for transforming a legacy data center into a mobile and modular data center (M2DC) and to propose a multivariate optimization for right-sizing the data center to business needs, using a case study. The research method is investigation and assessment of 21 sample data centers together with in-depth interviews with 32 IT managers and 8 data center consultants. The findings show that the standardized modules of M2DC force requirements to fit within the building boxes and allow expansion as needed.
1208
Abstract: This paper gives a general description of a typical combinatorial optimization problem, the collocation problem in production, and establishes three models: an integer linear programming model based on the collocation ways, an integer linear programming model based on the optimal collocation amounts, and an integer nonlinear programming model based on local optima. For solving the models, we mainly analyze the enumeration method and solution with the LINGO optimization software. An instance shows that the models and solution methods are all effective. The first model reflects the mechanism of the collocation problem best and is most likely to reach the global optimum, but it is harder to solve at large scale; the second model quickly yields the optimal numbers of finished products and the consumption of each kind of material, but the collocation ways still have to be enumerated step by step; the third model obtains a local optimum and the specific collocation ways by solving the model iteratively. The optimization models and solution methods in this paper can be extended to similar sub-problems of combinatorial optimization, such as the assembly problem, the packing problem and the cutting problem.
1215
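As a hedged sketch of the enumeration method mentioned in the abstract above (not the paper's LINGO models), the following Python code enumerates how many finished products can be assembled under material stock limits when several collocation ways are available; the recipes, stock figures and search bound are illustrative assumptions.

```python
from itertools import product

# Hypothetical example: a finished product can be assembled in several
# "collocation ways", each consuming a different mix of three materials.
WAYS = [(2, 1, 0), (1, 2, 1), (0, 1, 2)]   # units of each material per product
STOCK = (8, 9, 6)                          # available units of each material (assumed)
MAX_PER_WAY = 10                           # search bound for the enumeration

def enumerate_best():
    """Brute-force enumeration over how many products each way produces."""
    best_count, best_plan = 0, None
    for plan in product(range(MAX_PER_WAY + 1), repeat=len(WAYS)):
        used = [sum(n * WAYS[i][m] for i, n in enumerate(plan))
                for m in range(len(STOCK))]
        if all(u <= s for u, s in zip(used, STOCK)):
            count = sum(plan)
            if count > best_count:
                best_count, best_plan = count, plan
    return best_count, best_plan

if __name__ == "__main__":
    count, plan = enumerate_best()
    print("max finished products:", count, "using plan:", plan)
```

This kind of exhaustive search illustrates why the first model becomes hard at large scale: the number of candidate plans grows exponentially with the number of collocation ways.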
Abstract: This paper proposes a statistical coding methodology that uses covert side-channel information to address the security of timing packets; the main purpose is to enhance the security of the timing protocol while remaining backward compatible. In wireless communications, whether ad-hoc military/industrial networks or LTE/LTE-A networks, GPS is used to provide time and location, but attackers often try to spoof the signal. An alternative way of providing such a signal is to use protocols like the IEEE 1588 Precision Time Protocol (PTP); unfortunately, current timing packets are not encrypted and can be altered by attackers. To preserve the simplicity of such protocols, most vendors are reluctant to add encryption on top of them, even though end customers wish to see it. To resolve this dilemma, we propose a backward compatible solution. The basic idea is demonstrated using Matlab FFT calculations, and a future extension using the fractional FFT is also suggested.
1221
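The abstract above demonstrates its statistical coding idea with Matlab FFT calculations. As a hedged Python analogue, a low-amplitude periodic perturbation can be embedded in packet timestamps and recovered as a spectral peak; the packet rate, watermark frequency and modulation depth below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

RATE_HZ = 16.0        # nominal sync-packet rate (assumed)
WM_HZ = 2.0           # covert watermark frequency (assumed)
DEPTH_S = 2e-4        # timing modulation depth in seconds (assumed)
N = 512               # number of packets observed

def embed_watermark(rng):
    """Nominal send times plus a small sinusoidal covert perturbation and jitter."""
    k = np.arange(N)
    nominal = k / RATE_HZ
    covert = DEPTH_S * np.sin(2 * np.pi * WM_HZ * k / RATE_HZ)
    jitter = rng.normal(0.0, 5e-5, N)
    return nominal + covert + jitter

def detect_watermark(timestamps):
    """Look for the covert spectral line in the timing deviations."""
    deviations = timestamps - np.arange(N) / RATE_HZ
    spectrum = np.abs(np.fft.rfft(deviations - deviations.mean()))
    freqs = np.fft.rfftfreq(N, d=1.0 / RATE_HZ)
    return freqs[np.argmax(spectrum[1:]) + 1]

if __name__ == "__main__":
    ts = embed_watermark(np.random.default_rng(0))
    print("detected watermark frequency (Hz):", detect_watermark(ts))  # ~2.0
```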
Abstract: In this paper, the Maximum Likelihood (ML) method combined with the Quantum-behaved Particle Swarm Optimization (QPSO) algorithm is applied to jointly estimating the direction of arrival (DOA) and the Doppler frequency of multiple signal sources impinging on an antenna array. The extended observability matrix containing the angle and frequency parameters is established using a state-space model. The ML method estimates the parameters accurately and converts the problem from parameter estimation into a nonlinear multidimensional function optimization. The DOA and Doppler frequency are then estimated by optimizing the likelihood function with the QPSO algorithm. The proposed method preserves the asymptotically unbiased estimation of the ML method while reducing the computational burden of obtaining the solution. In addition, the estimated parameters are paired automatically.
1227
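Below is a minimal sketch of the QPSO update rule used to optimize a likelihood function such as the one described above. The 2-D quadratic test objective stands in for the actual DOA/Doppler likelihood, and the contraction-expansion schedule is an assumed choice rather than the paper's setting.

```python
import numpy as np

def qpso(objective, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0), seed=0):
    """Quantum-behaved PSO minimizing `objective` over a box (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    pbest = x.copy()
    pbest_val = np.array([objective(p) for p in pbest])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for t in range(iters):
        beta = 1.0 - 0.5 * t / iters          # contraction-expansion coefficient (assumed schedule)
        mbest = pbest.mean(axis=0)            # mean of personal best positions
        phi = rng.uniform(size=(n_particles, dim))
        attractor = phi * pbest + (1 - phi) * gbest
        u = rng.uniform(1e-12, 1.0, (n_particles, dim))
        sign = np.where(rng.uniform(size=(n_particles, dim)) < 0.5, -1.0, 1.0)
        x = attractor + sign * beta * np.abs(mbest - x) * np.log(1.0 / u)
        x = np.clip(x, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

if __name__ == "__main__":
    # Stand-in objective for the negative log-likelihood: minimum at (1, -2).
    f = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
    print(qpso(f, dim=2))
```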
Abstract: Traffic analysis attacks are commonly used to threaten the safety of the base station in a wireless sensor network (WSN) that uses the LEACH routing protocol. Typical packet traffic in such a network may exhibit pronounced patterns that allow an adversary analyzing the traffic to deduce the location of the base station. This paper proposes two defense algorithms based on the LEACH routing protocol to counter these traffic analysis techniques and conceal the true traffic pattern. First, a Random Hot Spots Algorithm (RHSA) is developed that randomly constructs several communication areas with fake locally high data volume to protect the base station of a WSN. Second, an Anonymous Communication Algorithm (ACA) is proposed to hide the identities of all nodes participating in packet transmission by generating pseudonyms dynamically. Simulation results show that RHSA and ACA can effectively deceive and misdirect the adversary and protect the location of the base station against attacks.
1233
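As a hedged illustration of the dynamic pseudonym idea behind the ACA scheme described above (the keyed-hash construction and pre-distributed key are assumptions, not the authors' exact protocol), each node can derive a fresh identifier per LEACH round from a shared secret, so packet headers never carry the real node ID.

```python
import hashlib
import hmac

NETWORK_KEY = b"shared-network-secret"   # pre-distributed key (assumed)

def pseudonym(node_id: str, round_no: int, length: int = 8) -> str:
    """Derive a per-round pseudonym so packet headers never expose the real ID."""
    msg = f"{node_id}:{round_no}".encode()
    return hmac.new(NETWORK_KEY, msg, hashlib.sha256).hexdigest()[:length]

def resolve(pseudo: str, known_ids, round_no: int):
    """A node holding the key can map a pseudonym back to the sender."""
    for nid in known_ids:
        if pseudonym(nid, round_no) == pseudo:
            return nid
    return None

if __name__ == "__main__":
    nodes = ["node-01", "node-02", "base-station"]
    p = pseudonym("node-02", round_no=7)
    print(p, "->", resolve(p, nodes, round_no=7))
```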
Abstract: Finding suitable information in the open and distributed environment of current simulation information is a crucial task, and one of the main challenges is overcoming semantic heterogeneity. Drawing on ontologies, the Semantic Web and cloud computing, this paper puts forward an ontology-based methodology for simulation information integration and service discovery: XML Schema is used for the formatted description of resources, ontologies for the formal description of resources, cloud computing to build an information resource pool, and SOA with Web services to realize the automatic discovery, matching and composition of information services. Finally, an example is tested to verify the feasibility of the methodology.
1241
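The discovery step described above matches requested concepts against resource annotations. Below is a minimal, assumed sketch of subsumption-based matching over a toy concept hierarchy; the ontology and resource list are illustrative, not the paper's ontology or service registry.

```python
# Toy ontology: child concept -> parent concept (illustrative assumption).
ONTOLOGY = {
    "RadarSimModel": "SimulationModel",
    "SonarSimModel": "SimulationModel",
    "SimulationModel": "SimulationResource",
    "TerrainData": "SimulationResource",
}

RESOURCES = [
    {"name": "seeker-radar-v2", "concept": "RadarSimModel"},
    {"name": "coastal-terrain", "concept": "TerrainData"},
]

def subsumes(general: str, specific: str) -> bool:
    """True if `specific` is the same as or a descendant of `general`."""
    while specific is not None:
        if specific == general:
            return True
        specific = ONTOLOGY.get(specific)
    return False

def discover(requested_concept: str):
    """Return resources whose annotation is subsumed by the requested concept."""
    return [r["name"] for r in RESOURCES if subsumes(requested_concept, r["concept"])]

if __name__ == "__main__":
    print(discover("SimulationModel"))      # ['seeker-radar-v2']
    print(discover("SimulationResource"))   # both resources
```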
Abstract: Disconnection is the main factor affecting the transmission of data and media in wireless environments. According to their influence on transmission quality, disconnections are divided into temporary disconnections and long-time disconnections. To handle them, a Wireless TCP Disconnection Management scheme is proposed, which comprises a TCP Fast Reconnection Mechanism for the Temporary Disconnection (TFRMTD) and User-Defined Times of TFRMTD (UDTT). First, the TFRMTD mechanism in the wireless environment is discussed in detail; it implements fast reconnection by giving every client a unique ID and caching the results at the server. Second, UDTT is introduced, which determines when TFRMTD has failed on a disconnection and therefore concludes that it is a long-time disconnection.
1246
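A hedged sketch of the server-side idea described above: cache the result keyed by the client's unique ID so that a client reconnecting after a temporary disconnection can fetch it without redoing the work, and treat expiry of the cached entry as the long-time case. The class, method names and TTL policy are assumptions for illustration, not the paper's implementation.

```python
import time

class ReconnectionCache:
    """Server-side cache of results keyed by client ID (illustrative sketch)."""

    def __init__(self, ttl_seconds: float = 30.0):
        self.ttl = ttl_seconds          # beyond this, treat as a long-time disconnection
        self._results = {}              # client_id -> (timestamp, result)

    def store(self, client_id: str, result: bytes) -> None:
        self._results[client_id] = (time.monotonic(), result)

    def fast_reconnect(self, client_id: str):
        """Return the cached result if the client came back within the TTL."""
        entry = self._results.get(client_id)
        if entry is None:
            return None
        stored_at, result = entry
        if time.monotonic() - stored_at > self.ttl:
            del self._results[client_id]    # long-time disconnection: discard
            return None
        return result

if __name__ == "__main__":
    cache = ReconnectionCache(ttl_seconds=5.0)
    cache.store("client-42", b"pending response")
    print(cache.fast_reconnect("client-42"))   # b'pending response'
```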
Abstract: At present, the common methods for determining the dynamic connectivity of a reservoir well group mainly include tracer testing, stress testing, well testing and numerical simulation. These methods are complex and expensive to implement and affect the normal production of the oilfield, whereas injection and production dynamic data can be obtained conveniently. This paper presents a method that uses dynamic reservoir development data to invert well group connectivity. The CART algorithm is used to analyze and extract potential knowledge from oilfield development data and to establish a direct mapping between the logging and dynamic data and the well group connectivity relationship. Experiments show that using dynamic data to study well group connectivity can greatly reduce the cost while achieving higher accuracy.
1252
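As a hedged illustration of applying CART to dynamic development data as described above (the feature names, synthetic samples and scikit-learn usage are assumptions, not the paper's dataset or implementation), a decision tree can be trained to map injection/production indicators for an injector-producer pair to a connectivity label.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Synthetic dynamic-data features per injector-producer pair (assumed):
# [injection-rate correlation, pressure-response lag (days), water-cut change]
X = np.array([
    [0.85, 3.0, 0.12],
    [0.10, 30.0, 0.01],
    [0.78, 5.0, 0.09],
    [0.20, 25.0, 0.02],
    [0.92, 2.0, 0.15],
    [0.05, 40.0, 0.00],
])
y = np.array([1, 0, 1, 0, 1, 0])   # 1 = connected, 0 = not connected

# CART with the Gini criterion, as implemented by scikit-learn.
tree = DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=0)
tree.fit(X, y)

# Predict connectivity for a new well pair (illustrative values).
print(tree.predict([[0.70, 6.0, 0.08]]))   # expected: [1]
```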