Search results

Online since: January 2011
Authors: Hong Jiang, Yan Hua Huang, Mamtimin Gheni, Yong Fang Shi
Geomagic can easily build accurate polygon models and meshes from point cloud data, and can also convert them into NURBS surfaces.
In fact, different reconstruction methods need to be adopted according to the type of data and the purpose of the application.
One reconstruction scheme is geared to CT/MRI data sources; the technical route of the reconstruction model is shown in Fig. 1. The image data obtained from CT/MRI scanning are stored and processed in DICOM format and then imported into Mimics.
The other is that the point cloud data are loaded into Imageware, in which characteristic curves and surfaces are obtained.
Firstly, the original point cloud data of the molar and incisor are acquired and optimized with the ATOS optical system.
Online since: February 2014
Authors: Shuang Long Liu, Wei Fu Liu, Li Xin Sun
Fracture types are recognized and fracture porosity is calculated by using bilateral logging data.
Then the fractal dimension of the fractures is calculated from seismic data to predict the distribution of the fractured zone.
To avoid loading unreasonable layer data into the fracture-system model before horizon matching, kriging techniques are applied to preprocess the data [7] and eliminate outliers, so that faults and horizons achieve a reasonable match.
The calculation fits the known calibration data until the calculated results match the known geological conditions, and then predicts the unknown.
Calculating fracture porosity from logging data gives full play to its advantages of high vertical resolution and direct characterization of the physical characteristics of the fractures.
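The excerpt does not spell out how the fractal dimension of the fracture distribution is computed; a common generic approach is box counting. The following minimal Python sketch (not the authors' implementation) estimates the box-counting dimension of a 2-D point set; the point set and box sizes are hypothetical.

```python
import math

def box_counting_dimension(points, sizes):
    """Estimate the fractal dimension of a 2-D point set by box counting:
    count occupied boxes N(s) at several box sizes s, then fit the slope
    of log N(s) versus log(1/s) by least squares."""
    logs, logn = [], []
    for s in sizes:
        # Each point falls into exactly one grid cell of side s.
        occupied = {(math.floor(x / s), math.floor(y / s)) for x, y in points}
        logs.append(math.log(1.0 / s))
        logn.append(math.log(len(occupied)))
    # Least-squares slope of log N against log(1/s).
    n = len(sizes)
    mx, my = sum(logs) / n, sum(logn) / n
    return sum((a - mx) * (b - my) for a, b in zip(logs, logn)) / \
           sum((a - mx) ** 2 for a in logs)

# Points along a straight line should give a dimension close to 1.
line = [(i / 1000.0, i / 1000.0) for i in range(1000)]
print(round(box_counting_dimension(line, [0.5, 0.25, 0.125, 0.0625]), 2))  # → 1.0
```

A fracture map would be supplied as the extracted fracture-trace coordinates; the closer the estimated dimension is to 2, the more densely the fractures fill the plane.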
Online since: August 2013
Authors: Ai Ling Qi, Jing Fang Wang, Frank Wang, Unekwu Idachaba, Gbola Akanmu
Principal Component Analysis (PCA) [6] is an important feature extraction method: high-dimensional data are projected onto a low-dimensional space via linear transformations, so as to achieve noise reduction and redundancy removal.
If the cumulative contribution ratio is greater than 85%, the first m principal components can represent most of the information of the original data; this not only reduces the number of dimensions but also minimizes the loss of information from the original data.
Composition of the experimental data sets:

Data set name   Samples   Categories   Original dimension   Dimension after PCA
Iris            150       3            4                    2
Weld_defect     200       4            10                   4

In this paper, PCA is used to reduce the dimensionality of the data, so as to compress it and reduce the amount of computation; the explained-variance threshold is set to threshold = 90%.
In order to verify the effectiveness and feasibility of the KNN algorithm, classification studies are carried out on the standard Iris data set and a welding-defect ultrasonic signal data set, as shown in the table above.
The Iris data set consists of feature vectors for three different types of irises.
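The selection of the number of principal components by cumulative contribution ratio described above can be sketched in a few lines of Python. This is a generic illustration, not the authors' code, and the eigenvalues below are hypothetical.

```python
def components_for_threshold(eigenvalues, threshold=0.90):
    """Return the smallest m such that the first m principal components'
    cumulative contribution ratio (explained variance) reaches `threshold`."""
    total = sum(eigenvalues)
    cumulative = 0.0
    for m, ev in enumerate(sorted(eigenvalues, reverse=True), start=1):
        cumulative += ev
        if cumulative / total >= threshold:
            return m
    return len(eigenvalues)

# Hypothetical eigenvalues of a 4-dimensional covariance matrix:
print(components_for_threshold([3.0, 0.9, 0.3, 0.1], 0.90))  # → 2
```

The eigenvalues come from the covariance matrix of the standardized data; the data are then projected onto the first m eigenvectors.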
Online since: June 2021
Authors: Nathan Nachandiya, Adeyemi Abel Ajibesin, Benisemeni Francis Dama, Olumide Babatope Longe
[15] proposes a framework for mHealth data security on Android systems.
This is done using a system that marks the sensitive data.
The framework taints the data when it is stored and then monitors the data flows in the device.
As the complexity of the data processed by the applications has increased, the amount of storage required to process this data needs to increase too, as seen in [25].
[7] ITU: Key statistical highlights: ITU data release June 2012.
Online since: September 2013
Authors: Hai Ying Chen, Shao Wei Wang, Hui Guo Yue, Yuan Liang, Qing Dang Qiao, Rui Ying Wang
SWAT model database construction. Spatial database: the spatial data include ground elevation, land-use types and their distribution, and soil types and their distribution.
The ground elevation data use 1:250,000 DEM data, and the soil types and their distribution use the 1:1,000,000 Chinese soil map from the Nanjing Institute of Soil Science.
Finally, the spatial data need to be unified to a single projected coordinate system.
The precipitation data are from the 38 rainfall stations in the entire basin.
The calibration phase of this study uses data from 1974 to 1978, the validation phase uses data from 1979 to 1982, and the 1973 data are used to warm up the model.
Online since: July 2011
Authors: Xiu Juan Wu, Zhong Shi Jia, Hong Jun Ni, Jing Jing Lv, Xiao Feng Wan, Peng Peng Wen, Ming Yu Huang
Fig. 3: Point cloud data obtained by contact measurement. Fig. 4: Point cloud data obtained by non-contact measurement.
Data preprocessing: pretreatment of point cloud data is an important element of reverse engineering, and it has a direct impact on the quality of the subsequent model reconstruction.
The purpose of data preprocessing is to process the corresponding measurement data so as to obtain high-quality point cloud data.
The pretreatment of point clouds usually includes multi-angle point cloud merging, noise removal, noise-point reduction, data reduction, point cloud segmentation, and so on.
Noise point identification and removal: because the data are collected by a combination of contact and non-contact measurement methods, the total amount of data obtained is very large, so some necessary processing is needed before models are reconstructed from the point cloud data.
Curves are the basis of data segmentation, and surfaces are the basis of modeling.
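Noise-point removal of the kind described above is often done with a statistical outlier filter: points whose mean distance to their nearest neighbours is unusually large are discarded. A minimal pure-Python sketch (not the authors' implementation; the threshold, neighbour count, and sample cloud are illustrative assumptions):

```python
import math
from statistics import mean, pstdev

def remove_statistical_outliers(points, k=3, n_std=1.0):
    """Drop points whose mean distance to their k nearest neighbours is more
    than n_std standard deviations above the average of that statistic.
    O(n^2) brute force -- fine for a sketch, not for dense scan data."""
    mean_dists = []
    for p in points:
        ds = sorted(math.dist(p, q) for q in points if q is not p)
        mean_dists.append(mean(ds[:k]))
    cutoff = mean(mean_dists) + n_std * pstdev(mean_dists)
    return [p for p, d in zip(points, mean_dists) if d <= cutoff]

# A tight cluster plus one far-away noise point:
cloud = [(0, 0, 0), (0.1, 0, 0), (0, 0.1, 0), (0.1, 0.1, 0), (5, 5, 5)]
print(len(remove_statistical_outliers(cloud)))  # → 4
```

Production tools replace the brute-force neighbour search with a k-d tree so the filter scales to millions of scanned points.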
Online since: May 2014
Authors: Hui Qiang Wang, Xiao Liang, Hong Wu Lv, Fang Fang Guo
This system includes a multi-source heterogeneous data source module, a data preprocessing module, a data mining module, a rule matching module, a situation assessment module, and a trend forecast module.
1) Multi-source heterogeneous data source module: the multi-source data are acquired by multiple sensors, including audit logs, application logs, checksum data, and network packets.
2) Data preprocessing module: this module includes data cleaning, data integration, data extraction, and data conversion.
3) Data mining module: this module uses the PFT-Apriori algorithm for mining association rules.
4) Rule matching module: this module performs real-time matching against the rule base and sends the matching results to the situation assessment module.
5) Situation assessment module: this module sends the assessment results back to the rule base and realizes the dynamic update of the rule base, while the rule base feeds the updated rules back to the rule mining module, implementing the consistency of
Figure 1. Cloud security situation awareness model based on data mining.
Association Rules Analysis
A. Association Rules: association rules [3][7] are an important technique in data mining, used to find dependencies or associations in data sets.
Filtering infrequent itemsets reduces the number of bits in the binary data and compresses the storage space; using the binary form reduces storage space and the number of I/O operations; and data segmentation reduces the number of index layers and improves query speed.
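For reference, the classic Apriori level-wise search that PFT-Apriori builds on can be sketched as follows. This is a textbook version, not the paper's optimized binary-encoded variant, and the basket data are hypothetical.

```python
from itertools import combinations

def apriori_frequent(transactions, min_support):
    """Classic Apriori: generate candidate itemsets level by level and keep
    those whose support count reaches min_support.
    Returns {frozenset itemset: support count}."""
    def count(candidates):
        counts = {c: 0 for c in candidates}
        for t in transactions:
            for c in candidates:
                if c <= t:
                    counts[c] += 1
        return {c: n for c, n in counts.items() if n >= min_support}

    items = {frozenset([i]) for t in transactions for i in t}
    frequent = count(items)
    result = dict(frequent)
    k = 2
    while frequent:
        # Join step: combine frequent (k-1)-itemsets into k-item candidates.
        candidates = {a | b for a, b in combinations(frequent, 2) if len(a | b) == k}
        frequent = count(candidates)
        result.update(frequent)
        k += 1
    return result

baskets = [frozenset(t) for t in (["a", "b", "c"], ["a", "b"], ["a", "c"],
                                  ["b", "c"], ["a", "b", "c"])]
freq = apriori_frequent(baskets, min_support=3)
print(freq[frozenset(["a", "b"])])  # → 3
```

Association rules are then read off each frequent itemset by checking the confidence of every antecedent/consequent split against a minimum-confidence threshold.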
Sodh, "Improving Efficiency of Apriori Algorithm Using Transaction Reduction," International Journal of Scientific and Research Publications, vol. 3, no. 1, January 2013, pp. 1-4.
Online since: January 2013
Authors: Hao Xiong Feng, Wei Jian Yi
It is known from many experimental data that the mechanical behavior of steel-tube-confined concrete has the characteristics of triaxial compression; that is, close to the limit load, the volume of the concrete increases sharply and exceeds its original volume, and the ultimate load capacity and the compression deformation grow with the confinement index.
One reduction factor accounts for the effect of the slenderness ratio on the limit strength, and another for the effect of the eccentricity ratio on the limit strength; it can be seen that below the corresponding limit values no reduction applies, but as the slenderness ratio increases, the factor gradually decreases.
Online since: August 2014
Authors: Ana Maria Smaranda Florescu, Georgeta Bandoc, Mircea Degeratu
This is necessary because a large amount of meteorological data is not always available.
For the determination of these ratios (ER), different kinds of energy were calculated over a period of six years, using hourly, daily, and monthly data.
Pollution reduction, efficient use of energy resources, and energy conservation are key to sustainable building [2].
From the measured raw data, the frequency distribution of the wind speed classes is shown as a histogram together with the corresponding Rayleigh, Weibull, and Johnson SB distribution curves [8].
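Fitting a Weibull distribution to measured wind speeds is commonly done with the moment approximation of Justus et al., k ≈ (σ/μ)^(-1.086) and c = μ/Γ(1 + 1/k). A minimal sketch, assuming hypothetical hourly wind speeds (the paper's data are not reproduced here):

```python
import math
from statistics import mean, pstdev

def weibull_from_moments(speeds):
    """Estimate Weibull shape k and scale c (same units as the speeds)
    from the sample mean and standard deviation, using the empirical
    moment method of Justus et al."""
    mu = mean(speeds)
    sigma = pstdev(speeds)
    k = (sigma / mu) ** -1.086          # shape parameter
    c = mu / math.gamma(1.0 + 1.0 / k)  # scale parameter
    return k, c

# Hypothetical hourly wind speeds in m/s:
speeds = [3.1, 4.5, 5.2, 6.0, 4.8, 3.9, 7.1, 5.5, 4.2, 6.3]
k, c = weibull_from_moments(speeds)
print(round(k, 1), round(c, 1))
```

The fitted k and c then parameterize the Weibull curve overlaid on the wind-speed histogram; more elaborate fits use maximum likelihood instead of moments.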
Online since: February 2014
Authors: Lin Hong Xu, Shu Lei Chen, Bo Lei, Xiao Shuang Rao
The results from this simulation can provide useful information for choosing a reasonable cutting edge and machining data to improve surface integrity and prolong cutting tool life in micro-milling operations.
Micro-milling Cutter and Machining Data. Micro Milling Cutter: in micro-milling, the diameter of a micro milling cutter with two cutting edges is usually between tenths of a millimeter and a few millimeters.
Since micro-milling differs in essence from conventional machining, the choice of machining data should be reconsidered.
The machining data for the simulated micro-milling process are as follows: cutting speed 50.265 m/min, feed rate 0.12 mm/r, cutting depth 0.1 mm.
Since the manufacture of micro milling cutters is much more difficult than that of conventional cutters, a reasonable choice of machining data that improves tool life has practical significance for micro machining.
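For context, the quoted cutting speed relates to spindle speed through the standard formula n = 1000·vc/(π·d). A small sketch, assuming a hypothetical 0.8 mm cutter diameter (the excerpt does not state the tool diameter):

```python
import math

def spindle_speed_rpm(cutting_speed_m_min, tool_diameter_mm):
    """Spindle speed n in rev/min from cutting speed vc in m/min and tool
    diameter d in mm: n = 1000 * vc / (pi * d)."""
    return 1000.0 * cutting_speed_m_min / (math.pi * tool_diameter_mm)

# With the quoted cutting speed of 50.265 m/min and an assumed 0.8 mm cutter:
n = spindle_speed_rpm(50.265, 0.8)
print(round(n))  # → 20000
```

The small diameters typical of micro-milling force very high spindle speeds to reach even modest cutting speeds, which is one reason machining data cannot simply be scaled down from conventional milling.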