Search results

Online since: September 2013
Authors: Chen Huang, Yan Huo
Taking a large carrier's customer base as an example, customer data are integrated from the operations, billing, accounting and other systems. First, according to data-mining theory, a unified customer feature table is formed; the feature set is then preprocessed and optimized by eliminating redundant features.
The output of this group of neural networks ultimately determines the attribution result. 3.2 Preprocessing. Because the profile features come from a variety of systems and contain some noisy data, the customer feature data must be preprocessed before the prediction model is trained.
After reducing strongly correlated features, the resulting behaviour feature set comprises: customer gender, time on the network, monthly spending power, handset-model usage information, data-service information, and marketing information.
After repeated training, once the network output matches the targets, training is complete and the network model can identify a particular terminal class. 3.4 Application and Evaluation. To validate the model, a 3G terminal industry marketing campaign is taken as an example. Customer data integrated from the business, billing, accounting and other systems were formed, according to the terminal-demand classification theory, into a feature set X = {X1, X2, X3, X4, X5, X6}, comprising: customer gender, time on the network, monthly spending power, handset-model usage information, data-service information, and marketing information.
Based on historical sales data, 2000 labelled feature groups were selected: 1500 for training and 500 for testing. Tab. 1 shows a test accuracy of nearly 80%.
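As a rough illustration of the preprocessing and evaluation steps described above, the feature scaling, the 1500/500 split and the test-accuracy computation could be sketched as follows (a minimal sketch in plain Python; the min-max scaling and the helper names are assumptions, since the paper does not show its code):

```python
def min_max_scale(rows):
    """Scale each feature column of `rows` to [0, 1] (assumed scheme)."""
    cols = list(zip(*rows))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    return [[(v - l) / (h - l) if h > l else 0.0
             for v, l, h in zip(r, lo, hi)] for r in rows]

def split_train_test(samples, n_train=1500):
    """Split the 2000 labelled feature groups into 1500/500."""
    return samples[:n_train], samples[n_train:]

def accuracy(predicted, actual):
    """Fraction of test samples classified correctly."""
    hits = sum(p == a for p, a in zip(predicted, actual))
    return hits / len(actual)
```

With any trained classifier plugged in between these helpers, a result like the paper's "nearly 80%" would simply be `accuracy(...)` over the 500 held-out groups.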
Online since: December 2010
Authors: Nariman A. Enikeev, Maxim Yu. Murashkin, Ruslan Valiev, Xavier Sauvage, Vil U. Kazykhanov
The data processing was performed using the GPM 3D Data software®.
Table 1 displays the mechanical tests data for the 1570 and 6061 alloys in coarse-grained (CG) and HPT-processed states.
These data demonstrate that UFG 6061 and 1570 alloys exhibit significantly increased strength as compared to CG states.
For comparison, data reported by Tsuji for a UFG 1100 Al alloy obtained by ARB [9] and by Furukawa et al. for a UFG Al-3%Mg alloy obtained by ECAP [10] were also added to the plot.
Fig. 3 shows some data collected on the 1570 alloy processed by HPT.
Online since: May 2011
Authors: Chao Sun, Qin Jun Du, Xing Guo Huang
This drastically reduces the chance of data loss and ensures system reliability.
Each joint of the robot is actuated by a Maxon DC motor with harmonic reduction.
The Linux application program for human-machine interaction accepts instructions from the console terminal, obtains on-line data from Memolink or off-line planning data from a desired-trajectory file, and then transmits them (trajectory data and control parameters) to the real-time tasks through a FIFO pipeline, a mechanism for communication between real-time tasks and ordinary Linux processes.
On one hand, when the Linux application program writes data to the FIFO_CTRL pipeline with the POSIX write function, the RT-Linux FIFO handler thread created with the rtf_create_handler API function is invoked automatically and retrieves the data from the FIFO_CTRL pipeline with the rtf_get API function.
On the other hand, the Linux application program receives the joint position data, robot body posture, control status and other sensor data from the real-time tasks via the FIFO_DATA pipeline and displays them on the terminal.
Online since: March 2014
Authors: Dan Ioan Stoia, Horia Hărăgus, Radu Prejbeanu, Dinu Vermesan, Simona Vermesan
In order to represent the reaction forces at the plantar level and to average this force over the lot of patients, the raw data recorded with the FDM platform were processed.
Processing the data here refers to extracting the data from the platform's sensor matrix, computing the resultant force, and aligning the data so that it can be represented graphically.
Matlab code used for data processing:

    data = xlsread('vectori_postop.xlsx');
    Vs1 = data(:,1); Vd1 = data(:,2);
    Vs2 = data(:,3); Vd2 = data(:,4);
    f = 61;                         % downsampling factor: keep every 61st sample
    Fs1 = Vs1(1:f:length(Vs1));
    Fd1 = Vd1(1:f:length(Vd1));
    Fs2 = Vs2(1:f:length(Vs2));
    Fd2 = Vd2(1:f:length(Vd2));
    ...
An important issue to underline here is that when we intend to average the reaction force during one step over a lot of patients, we have to normalize the data.
The normalization refers not only to the time (which was normalized here) but also to the data itself.
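A sketch of such a two-fold normalization, assuming linear-interpolation resampling over the step duration and scaling to the peak force (both are assumptions for illustration; the paper does not specify its exact scheme):

```python
def normalize_step(force, n_points=101):
    """Time-normalize one step to a fixed number of samples (0-100% of
    the step) by linear interpolation, then amplitude-normalize to the
    peak force so curves from different patients can be averaged."""
    m = len(force)
    resampled = []
    for k in range(n_points):
        t = k * (m - 1) / (n_points - 1)   # position in original samples
        i = int(t)
        frac = t - i
        if i + 1 < m:
            v = force[i] * (1 - frac) + force[i + 1] * frac
        else:
            v = force[-1]
        resampled.append(v)
    peak = max(resampled) or 1.0           # avoid division by zero
    return [v / peak for v in resampled]

def average_curves(curves):
    """Point-wise mean of several normalized step curves."""
    return [sum(vals) / len(vals) for vals in zip(*curves)]
```

After this, every patient contributes a curve on the same 0-100% time base and the same relative amplitude scale, so point-wise averaging is meaningful.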
Online since: March 2013
Authors: Sumuntana Anuchatkidjaroen, Thawatchai Phaechamud
Least-squares fitting of the in vitro release data (cumulative drug release >10% and up to 80%) to the mathematical expressions (power law, first order, Higuchi's and zero order) was carried out using the Scientist® for Windows program, version 2.1.
Addition of BB caused a significantly greater reduction of the pour point and cloud point than NMP.
Online since: November 2012
Authors: S.S.R. Koloor, A. Arefnia, J. Mohd Yatim, I.S. Ibrahim, M. Khalajmasoumi
The experimental data confirmed the simulation results well, indicating the accuracy of the analysis process.
The results compare well with the experimental data, and the discussion is presented in terms of the load-deflection of the sample and the stress analysis of the HDPE polymer under flexural loading.
A three-point bending test of the HDPE polymer was performed and the response recorded as a load-deflection curve [7], which shows a smooth reduction of the flexural stiffness of the polymer panel, indicating nonlinear deformation due to the polymer's hyperelastic behaviour.
Online since: August 2014
Authors: Hao Li, Jun Feng Qiao, Zuo Zhi Shao, Yun Peng Li
An SSL tunnel is established between the client and the SSL VPN server; it is designed to transmit data securely over the Internet or other untrusted public networks.
As shown in Fig. 2 (operations in each operation layer), the main control interface is responsible for controlling data input and output; for example, 32-bit input data can be transformed into 163-bit data that can be processed directly. It also stores the basic parameters of the encryption system, including the base point, the private key, the elliptic curve, and the other party's public key, and selects at the operating level whether an encryption operation, a decryption operation, or a group-control operation is performed.
Improved Methods and Theoretical Basis. Let g be an integer far smaller than m, where m is the bit length of the input data of the modular multiplication. If the input is processed g bits at a time, then m/g cycles are needed.
The data path of this modular multiplication method is shown below, where d is the degree of the second highest non-zero term of the field polynomial F(x).
We give the improved results at the end.
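The m/g-cycle claim can be illustrated with a small software sketch of digit-serial multiplication in a binary field (an assumption for illustration only: the design above targets hardware over a 163-bit field, while this example uses the 8-bit field GF(2^8) with the AES polynomial 0x11B so the arithmetic can be checked by hand):

```python
def digit_serial_gf2m_mul(a, b, f, m, g):
    """Multiply a and b in GF(2^m) defined by polynomial f, consuming
    g bits of b per iteration, so the loop runs ceil(m/g) times
    (the m/g cycles mentioned in the text)."""
    acc = 0
    cycles = -(-m // g)                       # ceil(m / g)
    for c in range(cycles):
        digit = (b >> (c * g)) & ((1 << g) - 1)
        for i in range(g):                    # carry-less partial products
            if (digit >> i) & 1:
                acc ^= a << (c * g + i)
    while acc.bit_length() > m:               # reduce modulo f
        acc ^= f << (acc.bit_length() - 1 - m)
    return acc, cycles
```

For m = 8 and g = 4, each multiplication takes 2 cycles; a hardware version would unroll the inner loop into g parallel partial-product rows, which is where the area/speed trade-off controlled by g comes from.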
Online since: February 2013
Authors: Wen Xia Liu, Ji Kai Xu, Hong Yuan Jiang, Yong Tao Shen
It is therefore necessary to develop a new method that obtains the reliability parameters of transmission lines by comprehensively utilizing line reliability information when reliability data are lacking.
Principal Component Analysis is a standard data-reduction technique that extracts the essential data, removes redundant information, highlights hidden features, and visualizes the main relationships that exist between observations.
Usually, finding the principal components of the line data set X can be converted into finding the eigenvalues and orthonormal eigenvectors of X's correlation matrix. Step 1: Standardize the original line data X = (xij)n×p, where n is the number of line samples, p is the dimension, and xij is the value of dimension j for line i. x1,...
Step 2: Calculate the correlation coefficient matrix of the sample dimensions, R = (rij)p×p (2), where Cov(xi,xj) is the covariance between columns i and j of the line data matrix.
Based on these ideas, principal component regression can be summarized as follows. Step 1: Analyze the standardized data with principal component analysis and extract the principal components of all the data.
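Steps 1 and 2 above can be sketched in plain Python (an illustrative sketch only; the function names are assumptions and the population standard deviation is used for the z-scores):

```python
import math

def standardize(X):
    """Step 1: z-score each column of the n x p data matrix X."""
    n, p = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(p)]
    stds = [math.sqrt(sum((row[j] - means[j]) ** 2 for row in X) / n)
            for j in range(p)]
    return [[(row[j] - means[j]) / stds[j] if stds[j] else 0.0
             for j in range(p)] for row in X]

def correlation_matrix(X):
    """Step 2: p x p correlation matrix R = (r_ij) of the columns,
    i.e. the mean product of the standardized columns."""
    Z = standardize(X)
    n, p = len(Z), len(Z[0])
    return [[sum(Z[k][i] * Z[k][j] for k in range(n)) / n
             for j in range(p)] for i in range(p)]
```

The eigen-decomposition of R then yields the principal components; in practice that step would be delegated to a linear-algebra library rather than hand-coded.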
Online since: April 2015
Authors: Benjapon Chalermsinsuwan, Pornpote Piumsomboon, Pilaiwan Chaiwang
The kinetic parameter from the analytical method was consistent with the experimental data in the literature.
Compared with the experimental data, the R2 values for the analytical method were higher than 0.95 for all heating rates.
Ozawa, A new method of analyzing thermogravimetric data.
Kinetic analysis of thermogravimetric data
Doyle, Estimating isothermal life from thermogravimetric data.
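The Ozawa approach cited above fits the logarithm of the heating rate against 1/T at a fixed conversion; a minimal sketch of such a fit and its R² is shown below (illustrative only: the slope factor 1.052·Ea/R follows the standard Flynn-Wall-Ozawa/Doyle approximation, not values taken from this paper):

```python
import math

def linear_fit(x, y):
    """Ordinary least-squares fit y = a*x + b, returning (a, b, r2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

def activation_energy_fwo(betas, temps_K, R=8.314):
    """FWO plot: ln(beta) vs 1/T; slope = -1.052*Ea/R  =>  Ea [J/mol]."""
    x = [1.0 / T for T in temps_K]
    y = [math.log(b) for b in betas]
    slope, _, r2 = linear_fit(x, y)
    return -slope * R / 1.052, r2
```

An R² near 1 for each conversion level, as in the paper's >0.95 values, indicates that the isoconversional lines are close to straight, i.e. the single-activation-energy assumption holds over the heating rates used.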
Online since: September 2013
Authors: Wen Bo Zhang, Ming Jie Mao, Qiu Ning Yang, Isamu Yoshitake
They developed the formula by referring to 1900 strength data points reported worldwide.
Input data for the simulation The objective fly-ash concretes for the simulation are given in Table I.
Several data for the simulation are summarized in Table II.
All input data except the compressive and tensile strengths refer to the data provided in the JSCE design code.
The strength data are estimated using equations (1)-(3).