Search results

Online since: December 2004
Authors: X.D. Zhang, C. Wang, Chang Ku Sun, S.H. Ye
A data reduction method is presented, based on a grid reduction method with color-boundary preservation.
This paper therefore provides a preprocessing flow that includes data arrangement, noise filtering, smoothing, data reduction, division, and reorganization.
The noise filtering and data reduction methods are analyzed in detail.
The preprocessing research places its emphasis on data reduction, since raw point cloud data are overcrowded.
Data reduction samples the point cloud and reduces the quantity of data.
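The excerpt does not include the authors' implementation; the following is a minimal Python sketch of grid-based point cloud reduction, assuming a uniform cell size and a hypothetical color-distance threshold (color_tol) standing in for the paper's color-boundary preservation step.

import numpy as np

def grid_reduce(points, colors, cell=0.05, color_tol=0.2):
    """Grid-based point cloud reduction (sketch).
    points: (N, 3) XYZ coordinates; colors: (N, 3) RGB in [0, 1].
    Each occupied grid cell is replaced by its centroid, but points whose
    color deviates strongly from the cell mean are kept as well, which
    roughly preserves color boundaries."""
    keys = np.floor(points / cell).astype(np.int64)
    order = np.lexsort(keys.T)                 # group points by grid cell
    ks = keys[order]
    splits = np.flatnonzero(np.any(np.diff(ks, axis=0), axis=1)) + 1
    out_pts, out_cols = [], []
    for idx in np.split(order, splits):
        c_mean = colors[idx].mean(axis=0)
        out_pts.append(points[idx].mean(axis=0))   # cell centroid
        out_cols.append(c_mean)
        # Keep points far from the cell's mean color (color boundary).
        far = idx[np.linalg.norm(colors[idx] - c_mean, axis=1) > color_tol]
        out_pts.extend(points[far])
        out_cols.extend(colors[far])
    return np.asarray(out_pts), np.asarray(out_cols)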
Online since: December 2014
Authors: Xing Qi He, Qi Jun Xian, Ping Yan
Research of Loss Reduction Efficiency Based on DEA. He Xing-qi, Xian Qi-jun, Yan Ping, State Grid Sichuan Electric Power Company, Chengdu, China. Key words: line loss management; energy conservation; data envelopment analysis. Abstract: An evaluation system of loss reduction efficiency was established for power supply companies using data envelopment analysis, and an empirical study of the loss reduction efficiency of power supply companies in Sichuan was conducted. The evaluation results indicate that institutional reform, grid construction and informatization promote the loss reduction efficiency of the power supply companies in Sichuan.
Moreover, a low line loss rate does not by itself indicate a better management level or higher loss reduction efficiency.
Therefore, an objective method to evaluate the loss reduction efficiency of power supply enterprises is urgently needed. To this end, using 2012-2014 statistical data for 21 power supply companies of Sichuan Power Grid Company, this article establishes an evaluation system of loss reduction efficiency based on the variable-returns-to-scale DEA model.
The results indicated that the loss reduction practice of some power supply companies still needed improvement and had not yet produced economies of scale.
Conclusions: An evaluation system of loss reduction efficiency was established for power supply companies using data envelopment analysis, and the panel data of the 21 power supply companies in Sichuan were used as an example for quantitative evaluation and analysis; the results can provide a reference for developing investment strategies and for loss reduction efficiency evaluation.
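For reference, the variable-returns model the article names is, in its standard input-oriented BCC form (the notation below is standard DEA notation, not taken from the excerpt): each company j = 1, ..., n consumes inputs x_{ij} and produces outputs y_{rj}, and the efficiency theta of the evaluated company 0 is

\min_{\theta,\,\lambda}\ \theta \quad \text{s.t.} \quad \sum_{j=1}^{n}\lambda_j x_{ij} \le \theta\, x_{i0},\ i=1,\dots,m; \qquad \sum_{j=1}^{n}\lambda_j y_{rj} \ge y_{r0},\ r=1,\dots,s; \qquad \sum_{j=1}^{n}\lambda_j = 1,\ \lambda_j \ge 0.

The convexity constraint \sum_j \lambda_j = 1 is what distinguishes the variable-returns-to-scale (BCC) model from the constant-returns CCR model; a company is efficient when \theta = 1.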
Online since: June 2019
Authors: Valentin Avramenko, Alexey Golikov, Arseniy Portnyagin, Evgenii K. Papynov
Using a spline, one can define the temperature dependence of the generalized rate constants from experimental data without any model assumptions.
Published kinetic parameters of iron oxide reduction span a rather broad range of values (18-246 kJ mol⁻¹ [9]), caused not only by differences in the composition and morphology of the investigated materials but also by inaccurate kinetic analysis.
Data at multiple heating rates were used to improve the reliability of the kinetic analysis, following the ICTAC Kinetics Project recommendations [10].
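The excerpt describes the spline approach only in outline; the sketch below shows the idea in Python, assuming generalized rate constants k(T) extracted from multiple-heating-rate runs (the synthetic data, variable names, and the use of scipy's UnivariateSpline are illustrative, not the authors' code).

import numpy as np
from scipy.interpolate import UnivariateSpline

R = 8.314                                    # J/(mol K)
# Illustrative data: rate constants synthesized from an Arrhenius law
# with Ea = 120 kJ/mol, standing in for experimental k(T) values.
T = np.linspace(600.0, 900.0, 30)            # K
k = 1e7 * np.exp(-120e3 / (R * T))           # s^-1

# Fit ln k as a smoothing spline in 1/T; no kinetic model is assumed.
x = 1.0 / T
order = np.argsort(x)
spline = UnivariateSpline(x[order], np.log(k)[order], s=1e-4)

# The local slope gives an apparent activation energy: d(ln k)/d(1/T) = -Ea/R.
Ea = -spline.derivative()(x) * R
print(Ea.mean() / 1e3, "kJ/mol")             # ~120 for these synthetic data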
Avramenko, An alternative approach to kinetic analysis of temperature-programmed reaction data, RSC Adv. 8 (2018) 3286-3295. doi:10.1039/c7ra09848k.
Galwey, The significance of "compensation effects" appearing in data published in "Computational aspects of kinetic analysis": ICTAC project, 2000, Thermochim. Acta.
Online since: August 2013
Authors: Xiao Fang Liu, Chun Yang
Training Data Reduction and Classification Based on Greedy Kernel Principal Component Analysis and Fuzzy C-means Algorithm. Xiaofang Liu, Chun Yang.
Given the data set, the standardization method in (8) is used to obtain normalized data and to balance each feature's impact on the classification results.
[Flowchart: feature extraction is applied to the test data, followed by a "Validity of classification?" yes/no decision.]
Results of training data reduction by the GKPCA method: the GKPCA method uses the Radial Basis Function (RBF) kernel in (9).
Extracted subsets from training data by GKPCA:
Data Set            Kernel Parameter   Samples in Subset   Training Data Reduction (%)
Iris                3                  22                  75.6
Landsat Satellite   8                  783                 82.3
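Equations (8) and (9) are cited only by number in the excerpt; the sketch below shows the general greedy kernel-subset idea with an RBF kernel in Python (the selection criterion and parameter names are illustrative assumptions, not the paper's exact algorithm).

import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def greedy_subset(X, m, sigma=1.0, ridge=1e-8):
    """Greedily pick m training samples whose span in RBF feature space
    best reconstructs the whole training set (a common greedy-KPCA-style
    criterion)."""
    K = rbf_kernel(X, X, sigma)
    chosen = []
    residual = np.diag(K).copy()              # unexplained feature-space norm
    for _ in range(m):
        i = int(np.argmax(residual))          # worst-approximated sample
        chosen.append(i)
        S = np.array(chosen)
        Kss = K[np.ix_(S, S)] + ridge * np.eye(len(S))
        Kxs = K[:, S]
        # Norm of each sample's projection onto span{phi(x_s) : s in S}.
        proj = np.einsum('ij,jk,ik->i', Kxs, np.linalg.inv(Kss), Kxs)
        residual = np.diag(K) - proj
        residual[S] = -np.inf                 # never reselect
    return chosen

Classification (e.g., by fuzzy c-means, as in the paper's title) then proceeds on the reduced subset instead of the full training set.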
Online since: July 2014
Authors: Yun Peng, Hong Xin Wan
The evaluation algorithm is based on the attributes of data objects.
Customer clustering involves a large amount of uncertain data, so traditional classification methods applied to such incomplete data become very complex.
Example analysis: the data objects are shown in Table 1.
Tests on actual data show that attribute reduction can reduce the size of the customer data, suppress data noise, and improve the clustering accuracy of web customers.
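As a minimal illustration of the idea only (not the paper's algorithm), one can drop near-constant, low-information attributes before clustering; the variance threshold and k-means below are illustrative stand-ins for the paper's attribute reduction and web-customer clustering.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_selection import VarianceThreshold

rng = np.random.default_rng(0)
# Toy customer table: 200 customers, 6 attributes; the last two attributes
# are nearly constant and carry mostly noise.
X = np.hstack([rng.normal(size=(200, 4)), 0.01 * rng.normal(size=(200, 2))])

# Attribute reduction: drop attributes with variance below a threshold.
X_red = VarianceThreshold(threshold=0.05).fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_red)
print(X.shape, '->', X_red.shape)             # (200, 6) -> (200, 4)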
Online since: October 2014
Authors: Hong Li Lv
Rough set theory is then applied to the fault diagnosis of power transformers because it can effectively analyze and handle imprecise and incomplete data, extract implicit knowledge, and simplify data.
Fig. 1 Block diagram of the RSNN system: sample data are discretized and attribute-reduced before entering the neural network, and test data are reduced in the same way before the result is output. Application of the attribute reduction algorithm: according to the DGA method, the concentrations of hydrogen (H2), methane (CH4), acetylene (C2H2), ethylene (C2H4) and ethane (C2H6) form the input data of the BP neural network.
Fifteen sets of data are selected as the training sample, and the original decision table is shown in Table 1. Each set has eight input values as the condition attributes, denoted by {C1, …, C8}.
The output data is defined as decision attribute, denoted by D.
After data reduction, the BP neural network is rebuilt, and the training result is shown in Fig. 2.
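The excerpt does not reproduce the reduction algorithm itself; the Python sketch below illustrates the same pipeline under simplifying assumptions: a discretized decision table, a simple consistency-based reduct heuristic in place of the paper's rough-set algorithm, and scikit-learn's MLPClassifier standing in for the BP network.

import numpy as np
from sklearn.neural_network import MLPClassifier

def consistent(X, y):
    """A decision table is consistent if objects identical on the
    condition attributes always share the same decision."""
    seen = {}
    for row, d in zip(map(tuple, X), y):
        if seen.setdefault(row, d) != d:
            return False
    return True

def greedy_reduct(X, y):
    """Drop each attribute in turn; keep the drop if the decision table
    stays consistent (a simple reduct heuristic)."""
    keep = list(range(X.shape[1]))
    for a in range(X.shape[1]):
        trial = [c for c in keep if c != a]
        if trial and consistent(X[:, trial], y):
            keep = trial
    return keep

# Toy discretized decision table: 8 condition attributes, 3 fault classes;
# by construction only attributes 0 and 3 determine the decision.
rng = np.random.default_rng(1)
X = rng.integers(0, 3, size=(200, 8))
y = (X[:, 0] + X[:, 3]) % 3

keep = greedy_reduct(X, y)                    # typically [0, 3]
net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
net.fit(X[:, keep], y)
print('reduced attributes:', keep)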
Online since: November 2011
Authors: Jian Yang Lin, Yong Yu, Ya Jie Xu, Zhou Mi Kan
The clustering centre data of Acanthopanax senticosus Harms are obtained with linear-reduction PSO and conjugate-gradient optimization.
The criterion data of Acanthopanax senticosus Harms are modelled as a Gamma distribution, similarity is calculated against the Normal distribution of the sample data, and the final similarity is determined by the cosine of the angle between them.
Introduction to reduction: in rough set theory [1], knowledge can be understood as an ability to classify, that is, to partition data, and it can be expressed as sets; for example, partitioning a given data set U by an equivalence relation P is called knowledge.
The final aim of knowledge reduction is not only to reduce the condition attributes in the repository and compress the data, but also to lay a good foundation for knowledge discovery.
The centre of the recommended-products cluster is (0.0062±0.00003, 0.0754±0.00038), the centre of the non-recommended-products cluster is (0.0127±0.00006, 1.6375±0.00819), and the centre of the counterfeit cluster is (0.0221±0.00011, 1.8280±0.00915).
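"Linear-reduction PSO" is not defined in the excerpt; the sketch below assumes the common reading, PSO with a linearly decreasing inertia weight, applied to finding cluster centres by minimizing the total point-to-nearest-centre distance (all names and parameters are illustrative).

import numpy as np

def pso_cluster_centres(data, k=3, particles=30, iters=100,
                        w_start=0.9, w_end=0.4, c1=2.0, c2=2.0, seed=0):
    """PSO with a linearly decreasing ('linear reduction') inertia weight.
    Each particle encodes k cluster centres; fitness is the summed
    distance of every data point to its nearest centre."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(data.min(0), data.max(0),
                      size=(particles, k, data.shape[1]))
    vel = np.zeros_like(pos)

    def fitness(centres):
        d = np.linalg.norm(data[:, None, :] - centres[None, :, :], axis=2)
        return d.min(axis=1).sum()

    pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    for t in range(iters):
        w = w_start - (w_start - w_end) * t / (iters - 1)  # linear reduction
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        f = np.array([fitness(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest

The PSO result could then be refined with a conjugate-gradient step, e.g. scipy.optimize.minimize(..., method='CG'), matching the excerpt's combination of the two optimizers.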
Online since: March 2011
Authors: Hao Jiang, Jie Xu, Zhuo Chen
Classical linear dimensionality reduction methods cannot cope with high-dimensional nonlinear input data, so nonlinear dimensionality reduction methods are needed.
On the other hand, the input data of nonlinear dimensionality reduction methods are usually real data, such as images and videos captured by cameras.
Real data are complex and noisy, so they are hard to process directly.
Real data need preprocessing, such as segmentation and alignment.
To avoid these difficulties while still obtaining useful conclusions, we artificially generate simple 2D motion data for motion analysis.
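The paper's data generator is not shown in the excerpt; a minimal sketch of the approach, using a translating blob as the "simple 2D motion" and scikit-learn's Isomap as a representative nonlinear method (both are illustrative choices):

import numpy as np
from sklearn.manifold import Isomap

# Synthetic simple motion: a 2D Gaussian blob translating horizontally.
# Each frame, flattened to a vector, lies on a 1D nonlinear manifold
# in pixel space.
size, frames = 32, 60
xs = np.linspace(4, size - 4, frames)          # blob centre per frame
yy, xx = np.mgrid[0:size, 0:size]
X = np.stack([np.exp(-((xx - cx) ** 2 + (yy - size / 2) ** 2) / 8.0).ravel()
              for cx in xs])                   # shape (frames, size * size)

# Nonlinear dimensionality reduction recovers the 1D motion parameter.
emb = Isomap(n_neighbors=6, n_components=1).fit_transform(X)
print(np.corrcoef(emb[:, 0], xs)[0, 1])        # magnitude close to 1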
Online since: October 2013
Authors: Cheng Zhe Xu, Wei Hong Zhu
Dimensionality Reduction.
Hyperspectral data used in this study has a high dimensionality.
The low-frequency components generally contain most of the information in the data; thus, the corresponding approximation coefficients can describe the original data.
Figure 2 shows the result of dimensionality reduction using the DWT; the shape of the approximation-coefficient data is very similar to that of the original data, which indicates that the DWT can be used for dimensionality reduction.
(A) 512-dimensional original hyperspectral data; (B) 16-dimensional approximation-coefficient data after a 5-level DWT.
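A minimal sketch of this reduction (PyWavelets' wavedec with the Haar wavelet is an assumed stand-in; the excerpt does not name the wavelet used):

import numpy as np
import pywt

# Illustrative 512-band spectrum standing in for one hyperspectral pixel.
spectrum = np.sin(np.linspace(0, 8 * np.pi, 512)) + 0.05 * np.random.randn(512)

# 5-level DWT; keeping only the approximation coefficients reduces
# 512 dimensions to 512 / 2**5 = 16.
coeffs = pywt.wavedec(spectrum, 'db1', level=5)
approx = coeffs[0]
print(spectrum.shape, '->', approx.shape)      # (512,) -> (16,)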
Online since: September 2014
Authors: Zhi Kong, Guo Dong Zhang, Li Fu Wang
Normal parameter reduction in soft sets is difficult to apply in data mining because of its great computational cost.
Experience has shown that the method is feasible and fast. 1. Introduction. Uncertain data appear in economics, engineering, environmental science, sociology, medical science, and many other fields, and the nature and complexity of such data can vary greatly.
Parameter reduction is the key problem in soft set theory.
Searching for the reduction can take much time.
In Section 3, normal parameter reduction in soft set is reviewed and the reduction model is given.
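For concreteness, a brute-force Python sketch of normal parameter reduction as it is commonly defined (a parameter subset is dispensable when it contributes the same choice value to every object, so deleting it preserves the ranking of objects); the Boolean-table encoding and names are illustrative, and the exhaustive search shows exactly why the calculation quantity is great.

import numpy as np
from itertools import combinations

def normal_parameter_reduction(F):
    """F: (objects x parameters) 0/1 soft-set table. Find the largest
    dispensable subset A: row sums over A are equal for all objects,
    so removing A keeps the objects' choice-value ranking unchanged."""
    n_obj, n_par = F.shape
    for size in range(n_par - 1, 0, -1):          # prefer the largest A
        for A in combinations(range(n_par), size):
            contrib = F[:, list(A)].sum(axis=1)
            if np.all(contrib == contrib[0]):     # equal for every object
                return [p for p in range(n_par) if p not in A], list(A)
    return list(range(n_par)), []                 # nothing dispensable

# Toy soft set: 4 objects, 5 parameters; parameters {3, 4} contribute
# exactly 1 to every object, so they are dispensable.
F = np.array([[1, 0, 1, 1, 0],
              [0, 1, 0, 0, 1],
              [1, 1, 1, 1, 0],
              [1, 0, 1, 0, 1]])
keep, removed = normal_parameter_reduction(F)
print('keep:', keep, 'removed:', removed)         # keep: [0, 1, 2] removed: [3, 4]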