Search results

Online since: June 2014
Authors: Sonia Braunstein Faldini, Leila F. de Miranda, Renato Meneghetti Peres, C.Y.U. Peres, Cesar Denuzzo, Antonio H. Munhoz, Gabriel Cavalcante Gomes
For the samples dried at 110 °C for 24 hours, diffraction data were recorded with a Rigaku MiniFlex with a fixed monochromator.
The collected data were compared with the ICDD 10-173 data from the International Centre for Diffraction Data® (ICDD®).
Data of the composite.
The heat deflection temperature (HDT) data show that the addition of pseudoboehmite and octadecylamine increased the HDT.
The melt flow rate test data (Table 2) show improved flow properties of the composite.
Online since: July 2013
Authors: Lin Li, Rong Nie, Lin Bin Jia
So we consider both types of features. 2.2 Dissimilarity Representation and Data Preprocessing. An appropriate representation of samples is based on the data.
200 examples are used as training data.
Detection accuracy is low when few features are used, on both the original and the transformed data.
It is fair to say that if the original data cannot be described, the structure of these data does not exist either.
So there are no grounds to expect that the transformed data absorb the structural information of the original data.
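The excerpt above mentions a dissimilarity representation but does not say which measure the authors use. As a minimal sketch, assuming Euclidean distances to a set of prototype samples (one common form of dissimilarity representation; the function name is mine):

```python
import numpy as np

def dissimilarity_representation(samples, prototypes):
    """Represent each sample by its Euclidean distances to a set of
    prototype samples. Returns an (n_samples, n_prototypes) matrix."""
    samples = np.asarray(samples, dtype=float)
    prototypes = np.asarray(prototypes, dtype=float)
    # Pairwise Euclidean distances via broadcasting.
    diff = samples[:, None, :] - prototypes[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

# Toy data: 3 samples, 2 prototypes, 2 features each.
X = [[0.0, 0.0], [3.0, 4.0], [1.0, 1.0]]
P = [[0.0, 0.0], [3.0, 4.0]]
D = dissimilarity_representation(X, P)  # shape (3, 2)
```

Each row of `D` then serves as the feature vector of one sample in the dissimilarity space.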
Online since: January 2013
Authors: Heng Ya Guo
The data show that reverse engineering can shorten the product design and manufacturing life cycle by more than 40%.
Reverse engineering is well suited to new part design, copying of existing parts, restoration of damaged or worn parts, model accuracy verification, replication of art and archaeological relics, refinement of product designs, digital model checking, etc. The following should be attended to when using reverse engineering technology for optimal mold design: (1) Data collection: improve the precision and efficiency of scan data collection, use advanced feature-recognition technology to reduce manual intervention, and automate data acquisition during scanning.
(2) Data processing: improve the point cloud data processing workflow to eliminate various interference factors, establish unified data format conversion standards, and reduce data distortion and loss.
The point cloud data are spliced to obtain the merged point cloud shown in Fig. 5.
The Imageware software processes the measurement data as follows: removal of stray and noise points, data compaction, data blocking, and curve/surface reconstruction; suturing, trimming, and transitions are then carried out in UG NX.
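The excerpt does not describe the noise-point removal that Imageware performs; a common stand-in is statistical outlier removal, sketched below under my own assumptions (function and parameter names are mine, and real scan data would use a KD-tree rather than a full distance matrix):

```python
import numpy as np

def remove_noise_points(points, k=8, std_ratio=2.0):
    """Statistical outlier removal: drop points whose mean distance to
    their k nearest neighbours is more than `std_ratio` standard
    deviations above the average over the whole cloud."""
    pts = np.asarray(points, dtype=float)
    # Full pairwise distance matrix (fine for small clouds only).
    d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
    # Mean distance to the k nearest neighbours, excluding self (column 0).
    knn_mean = np.sort(d, axis=1)[:, 1:k + 1].mean(axis=1)
    threshold = knn_mean.mean() + std_ratio * knn_mean.std()
    return pts[knn_mean <= threshold]

# A dense line of 20 points plus one far-away noise point.
cloud = [[i * 0.1, 0.0, 0.0] for i in range(20)] + [[100.0, 100.0, 100.0]]
clean = remove_noise_points(cloud)  # the outlier is dropped
```

The same idea underlies the statistical outlier filters found in point cloud libraries such as PCL and Open3D.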
Online since: January 2014
Authors: Lan Tian, Qing Hua Song, Xiao Shan Lu
They are the user terminal device (UTD) and the data processing center (DPC).
The audio data acquisition module collects audio data from the audio outlet of the TV set.
The original audio data from the audio outlet of the TV set are sampled and stored in the data buffer.
To further compress the feature data and retain only the feature contour of one package of data, the energy features of the L frames in the corresponding bands can also be averaged to …
The standard audio database stores all channels’ TV program audio data.
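The L-frame averaging step mentioned above can be sketched as follows, assuming the features are stored as an (n_frames, n_bands) energy array; the function name and layout are illustrative, not the authors' code:

```python
import numpy as np

def average_energy_features(frame_energies, L):
    """Compress per-frame band energies by averaging groups of L
    consecutive frames, keeping one feature contour per group.
    `frame_energies` has shape (n_frames, n_bands)."""
    e = np.asarray(frame_energies, dtype=float)
    n_groups = e.shape[0] // L            # drop any incomplete tail group
    return e[:n_groups * L].reshape(n_groups, L, -1).mean(axis=1)

# 6 frames, 2 bands, averaged in groups of L = 3 frames.
E = [[1, 10], [2, 20], [3, 30], [4, 40], [5, 50], [6, 60]]
avg = average_energy_features(E, L=3)  # one 2-band contour per group
```

This reduces the feature data by a factor of L while preserving the per-band energy contour of each package.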
Online since: October 2009
Authors: P.J.S. Foot, H. Hadavinia, V.G. Izzard, C.H. Bradsell, V.J. Morris, N. Witten
Little, however, has been reported on the data from the compression set test itself.
There is good agreement between the two data sources.
This is illustrated by the data in Fig.1 for all temperatures investigated.
This observation cannot be applied to the 25% initial fixed compressive strain data.
As shown in Fig. 1, these data are more widely scattered than those obtained from the 50% initial fixed compressive strain tests.
Online since: July 2014
Authors: Hai Zhen Wang, Da Hui Li, Zuo Zheng Lian
Here M is the number of BP neural network training data and N is the smallest scale; that is, the scale coefficient of the (M+1)-th datum is predicted from the scale coefficients of the last M data. BP neural networks 1 to N can use the same structure and process the N sets of data in parallel, so each layer generates a new set of scale coefficients, and the scale coefficient prediction output is obtained from the trained network.
Step 7: Obtain the multipliers from steps 2 and 3 above, use them with the actual data to initialize the AR model, estimate the AR model parameters, and then obtain the prediction multiplier according to (11).
Step 8: Compute on the basis of the prediction multiplier and apply the inverse wavelet transform together with the scale coefficients to generate the prediction data, as shown in Fig. 1.
The (M+1)-th datum is predicted from the scale coefficient prediction values of the last M data and the multiplier A_{M+1}; in the same way the (M+2)-th datum can be predicted, and so on.
Fig. 1: Predicted data generation method. Fig. 2: BP neural network structure. A three-layer BP neural network can approximate any nonlinear function [4]; as shown in Fig. 2, the circles represent neuron nodes, and the number of nodes in each layer is chosen from experience and experiment.
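Equation (11) and the authors' exact AR formulation are not reproduced in this excerpt. As a hedged sketch of the AR prediction in steps 7 and 8, here is a generic least-squares AR(M) fit and one-step prediction, not necessarily the authors' method (names are mine):

```python
import numpy as np

def fit_ar_predict(series, M):
    """Fit an AR(M) model by least squares and predict the next value
    from the last M observations."""
    x = np.asarray(series, dtype=float)
    # Each row holds M past values; the target is the following value.
    rows = np.array([x[i:i + M] for i in range(len(x) - M)])
    targets = x[M:]
    coeffs, *_ = np.linalg.lstsq(rows, targets, rcond=None)
    return float(x[-M:] @ coeffs)

# A linear ramp satisfies x[t] = 2*x[t-1] - x[t-2], i.e. exactly AR(2).
ramp = [1, 2, 3, 4, 5, 6]
pred = fit_ar_predict(ramp, M=2)  # next value of the ramp
```

In the method described above, the same one-step prediction would be applied to the wavelet multipliers before the inverse transform recombines them with the scale coefficients.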
We conducted an experiment on "The Silence of the Lambs" (from the University of Berlin MPEG-4 video trace database [5]). As shown in Fig. 3, when the size of a video frame does not change sharply, the difference between the predicted and real values is small; when the amount of data changes severely, the difference also increases, but the predicted values can still reflect the trend of the real data to a certain degree.
Online since: September 2014
Authors: Tai Fu Lv
High-density network Intrusion detection expert system uses data-driven reasoning mechanism.
Principal component analysis of the high-density network intrusion detection data proceeds as follows: (1) standardization of the high-density network data.
After studying the neural network modeling, high-density network intrusion data can be detected.
(3) The expert system performs principal component analysis on data considered to represent normal behavior and then extracts the main features of the data.
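The standardization and feature-extraction steps listed above can be sketched as plain PCA on standardized data. This is a minimal sketch, assuming z-score standardization and eigendecomposition of the covariance of the standardized data; names and details are mine, not the authors' exact procedure:

```python
import numpy as np

def pca_features(data, n_components):
    """Standardize each column (step 1), then extract the leading
    principal components (step 3) via the eigenvectors of the
    covariance of the standardized data. Returns projected features."""
    x = np.asarray(data, dtype=float)
    z = (x - x.mean(axis=0)) / x.std(axis=0)   # z-score standardization
    cov = np.cov(z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]          # largest variance first
    return z @ eigvecs[:, order[:n_components]]

# Toy "detection records": 5 samples, 3 attributes.
X = [[1.0, 2.0, 0.5], [2.0, 4.1, 0.4], [3.0, 6.2, 0.6],
     [4.0, 7.9, 0.5], [5.0, 10.0, 0.7]]
F = pca_features(X, n_components=2)  # main features of the data
```

The projected features `F` would then be what the neural network model is trained on.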
The Core of Attributes and Minimal Attributes Reduction in Variable Precision Rough Set [J].
Online since: August 2013
Authors: Ying Wu, Bin Li
Under a voltage of 220 V, this article carried out ignition tests of three categories of combustible substance (polyurethane foam, cotton cloth, and newspaper) by simulating short-circuit faults of aluminum conducting wire, in order to obtain the characteristic parameters of combustible substances ignited by short-circuit beads at different heights, summarize the distribution of fire origins for polyurethane foam ignited by short-circuit beads at different heights, and provide detailed fundamental data and effective technical support to primary-level fire investigators during on-site investigation.
Introduction. During actual operation of electric wires and cables, long-term exposure to external conditions such as dampness and corrosion accelerates aging of the wire/cable and reduces its insulation performance.
See Table 5.6 for the specific ignition data of different combustible substances at different short-circuit heights.
These fundamental ignition data provide detailed reference data and effective technical support for primary-level fire investigators during on-site investigation.
Online since: July 2014
Authors: K.R. Kiron, K. Kannan
Data were collected through a field survey based on a prepared questionnaire.
Data Collection: Based on the factors mentioned above, a questionnaire was prepared after discussion with experts in the field.
The study is empirical in nature, as statistical and other primary data were collected through the field survey.
The data were collected from 15 steel manufacturing MSMEs in Kanjikode, Kerala state, India.
Production Flexibility and Production Cost. Most organizations are concerned with reducing production cost.
Online since: July 2014
Authors: Xi Zhao, Guang Ming Li
Most commercial banks have put their business and data processing into centralized IT&Data centers.
As a result, the centralized IT&Data centers of banks have become the backbone of bank business and data processing, concentrating voluminous computation, data storage, and network resources.
Procedures and operations in a task are defined according to business rules and data stocks, and may require or produce ephemeral data.
Discussion. Bank IT&Data centers have centralized all business processing and data storage and have become the backbone of bank business operation and development.
MapReduce: Simplified data processing on large clusters.