Search results
Online since: June 2013
Authors: Feng Hua Guo
A new linear prediction-based haptic data reduction technique is presented.
Experiments demonstrate the effectiveness of the proposed approach in terms of data reduction rate.
The objective is to achieve high data reduction performance.
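As an illustrative sketch only (not the authors' exact scheme), the following Python example shows the general idea behind prediction-based perceptual haptic data reduction: a sample is transmitted only when the error of a simple linear predictor exceeds a deadband proportional to the last transmitted value. The signal, deadband factor, and predictor are assumptions for demonstration.

```python
import numpy as np

# Hypothetical 1 kHz haptic force signal (illustrative values only).
t = np.arange(0.0, 1.0, 0.001)
force = np.sin(2 * np.pi * 2 * t) + 0.02 * np.random.randn(t.size)

k = 0.1                    # assumed Weber-fraction deadband (10 %)
sent = [(0, force[0])]     # always transmit the first sample

for i in range(1, force.size):
    if len(sent) >= 2:
        # Linear prediction from the last two transmitted samples.
        (i1, f1), (i2, f2) = sent[-2], sent[-1]
        slope = (f2 - f1) / (i2 - i1)
        pred = f2 + slope * (i - i2)
    else:
        pred = sent[-1][1]  # hold-last fallback
    # Transmit only if the prediction error exceeds the deadband.
    if abs(force[i] - pred) > k * max(abs(sent[-1][1]), 1e-6):
        sent.append((i, force[i]))

print(f"reduction rate: {1 - len(sent) / force.size:.2%}")
```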
Perception-based data reduction and transmission of haptic data in telepresence and teleaction systems.
Prediction-based haptic data reduction and transmission in telementoring systems.
Online since: July 2014
Authors: Zhi Hao Peng, An Sheng Deng, Wei Luo
Data Processing in Attributes Reduction Based on Rough Sets and FCA
Zhihao Peng 1,2,a, Wei Luo 1,2,b, Ansheng Deng 1,c
1 Faculty of Information Science and Technology, Dalian Maritime University, Dalian 116024, China
2 Department of Computer Science, Dalian Neusoft Institute of Information, Dalian 116626, China
a pengzhihao@neusoft.edu.cn, b luowei@neusoft.edu.cn, c ashdeng@dlmu.edu.cn
Corresponding author.
Introduction: Rough set theory (RST), proposed by Pawlak, is a relatively new soft computing tool for conceptualizing and analyzing various types of data.
It has become a widely used mathematical framework for image processing, feature selection, pattern recognition, rule extraction, conflict analysis, granular computing, decision support, data mining, and knowledge discovery from given data sets.
In other words, useful features can be selected from a given data set on the basis of rough set theory.
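For context, the standard rough-set notions underlying attribute (feature) reduction can be written as follows; these are textbook definitions and are not taken from this paper.

\[
\mathrm{IND}(B) = \{(x, y) \in U \times U : a(x) = a(y)\ \text{for all } a \in B\}
\]
\[
\underline{B}X = \{x \in U : [x]_B \subseteq X\}, \qquad
\mathrm{POS}_B(D) = \bigcup_{X \in U/\mathrm{IND}(D)} \underline{B}X, \qquad
\gamma_B(D) = \frac{|\mathrm{POS}_B(D)|}{|U|}
\]

A subset $B \subseteq C$ of condition attributes is a reduct if $\mathrm{POS}_B(D) = \mathrm{POS}_C(D)$ and no proper subset of $B$ preserves this property; attributes outside a reduct can be removed without losing classification ability.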
Concept Data Analysis: Theory and Applications. John Wiley & Sons.
Online since: January 2010
Authors: Cong Feng Jiang, Ying Hui Zhao, Jian Wan
Conventional power reduction techniques such as dynamic voltage/frequency scaling (DVS/DFS) have disadvantages when they are ported to current data centers with virtualization deployments.
Moreover, power reduction is particularly valuable for data centers with thermal emergencies, cooling constraints, and power-supply constraints.
Since data centers experience large periods of low utilization, they present opportunities for power reduction with minimal performance losses.
The input and output are real-time power consumption data.
This paper also presents the challenges of power reduction in large data centers.
Online since: February 2011
Authors: Yan Kun Li
Construction Method of Fuzzy Attribute Information System Based on DEA and Attribute Reduction
Li Yankun
College of Light Industry, Hebei Polytechnic University, Tangshan, Hebei, China
lyk_917@163.com
Keywords: fuzzy attribute information system; attribute reduction; data envelopment analysis; relative efficiency
Abstract: To address the subjectivity of attribute values in fuzzy attribute information systems, this paper proposes a construction method for fuzzy attribute information systems based on DEA (Data Envelopment Analysis).
The continuous data should first be discretized before reduction.
Data Envelopment Analysis [4] evaluates the relative efficiency of decision-making units (DMUs) with multiple inputs and multiple outputs using a mathematical programming model.
A DEA model can then be built for the original data to evaluate the relative efficiency of its elements.
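As a minimal sketch of the kind of model involved (not the paper's implementation), the input-oriented CCR envelopment model for a single decision-making unit can be solved as a linear program; the input/output matrices below are invented placeholder data.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 5 DMUs, 2 inputs, 1 output (columns = DMUs).
X = np.array([[4.0, 7.0, 8.0, 4.0, 2.0],      # inputs
              [3.0, 3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0, 1.0]])     # outputs
m, n = X.shape
s = Y.shape[0]

def ccr_efficiency(j0):
    """Input-oriented CCR efficiency of DMU j0: minimise theta subject to
       X @ lam <= theta * X[:, j0],  Y @ lam >= Y[:, j0],  lam >= 0."""
    c = np.zeros(1 + n)
    c[0] = 1.0                                       # minimise theta
    A_in = np.hstack([-X[:, [j0]], X])               # X lam - theta x0 <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])        # -Y lam <= -y0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[:, j0]])
    bounds = [(None, None)] + [(0, None)] * n        # theta free, lam >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[0]

for j in range(n):
    print(f"DMU {j}: relative efficiency = {ccr_efficiency(j):.3f}")
```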
Data Envelopment Analysis [M].
Online since: March 2017
Authors: Alinda Samsuri, Mohd Ambar Yarmo, Mohamed Wahab Mohamed Hisham, Mohd Nor Latif
Increasing the temperature appears to cause rapid reduction, which correlates with the thermodynamic data showing that the formation of metallic molybdenum becomes feasible as the temperature increases.
Thermodynamic data were used to discuss the feasibility of the reduction stages in terms of the Gibbs-Helmholtz relationship given below.
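The relationship referred to is usually written in terms of the standard Gibbs free energy of reaction (the specific ΔH° and ΔS° values are those tabulated by the authors and are not reproduced here):

\[
\Delta G^{\circ}_{T} = \Delta H^{\circ} - T\,\Delta S^{\circ},
\qquad \text{the reduction step is thermodynamically feasible when } \Delta G^{\circ}_{T} < 0 .
\]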
For the second reduction stage, the non-isothermal reduction mode was continued up to 900 °C.
Table 1 shows the thermodynamic data for the reduction of MoO3 using hydrogen as the reducing agent.
This result agrees with the thermodynamic data in Table 1, which indicate that the formation of molybdenum requires a high temperature for the reaction to become favourable.
Online since: February 2014
Authors: Dong Xie, Hua Bin Wang, Di Jian Xue, Jin Liang Shi
Through a scientific experiment or observation, a set of data pairs of the quantities x and y is obtained.
Least squares fitting algorithm based on orthogonal functions: by analyzing the variation of the experimental data points, a curve-fitting equation can be created for the target data.
But if the chosen family of functions is orthogonal with respect to the weights at the set of data points, i.e. $\sum_i w_i\,\varphi_j(x_i)\,\varphi_k(x_i) = 0$ for $j \neq k$, then the normal equations decouple and each fitting coefficient can be computed independently.
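A minimal Python sketch of this idea (not the paper's program): orthogonalizing the monomial basis over the measured points, here via a QR factorization of the Vandermonde matrix, makes each fitting coefficient an independent projection of the data rather than the solution of coupled normal equations. The sample data are invented.

```python
import numpy as np

# Invented data pairs (x_i, y_i) standing in for the experimental measurements.
x = np.linspace(0.0, 5.0, 30)
y = 1.5 + 0.8 * x - 0.1 * x**2 + 0.05 * np.random.randn(x.size)

degree = 2
V = np.vander(x, degree + 1, increasing=True)   # monomial basis 1, x, x^2

# The columns of Q are orthonormal over the data points, so each coefficient
# is an independent inner product with y.
Q, R = np.linalg.qr(V)
c_orth = Q.T @ y            # coefficients in the orthogonal basis
y_fit = Q @ c_orth          # fitted values at the data points

# Coefficients in the original monomial basis, if needed.
c_mono = np.linalg.solve(R, c_orth)
print("monomial coefficients:", c_mono)
print("RMS residual:", np.sqrt(np.mean((y - y_fit) ** 2)))
```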
For data computation, a well-prepared program in a suitable language is often already available to the user.
An electronic scale provides real-time measurement of the ore weight in the reduction furnace, which is then sent to the host computer for data processing.
Online since: December 2012
Authors: Ying Huan Zou, Jing Jie Chen
This further improves the reliability of the data analysis results.
Reduction of the decision table: an important part of decision table reduction is simplifying the condition attributes in the decision table. The reduced decision table has the same features as the unreduced table, but with fewer condition attributes, reflecting the real core attributes of the data set. For this experiment, the goal is to identify the core attributes required to estimate fuel consumption from the QAR data set.
The QAR data attribute reduction experiment extracted the core attributes from the experimental data and laid a good foundation for the next step of establishing the fuel consumption estimation model.
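As a small illustration only (using a made-up decision table, not the QAR data), the rough-set dependency degree can be used to identify core condition attributes: an attribute belongs to the core if removing it lowers the dependency of the decision on the remaining attributes.

```python
from itertools import groupby

# Made-up decision table: each row = (condition attribute values, decision).
rows = [((1, 0, 1), 'high'), ((1, 0, 0), 'high'), ((0, 1, 1), 'low'),
        ((0, 1, 0), 'low'),  ((1, 1, 1), 'high'), ((0, 0, 0), 'low')]
n_cond = 3

def partition(attr_idx):
    """Indiscernibility classes of the universe w.r.t. a set of condition attributes."""
    key = lambda i: tuple(rows[i][0][a] for a in attr_idx)
    idx = sorted(range(len(rows)), key=key)
    return [set(g) for _, g in groupby(idx, key=key)]

def dependency(attr_idx):
    """gamma_B(D): fraction of objects whose decision is determined unambiguously by B."""
    pos = sum(len(block) for block in partition(attr_idx)
              if len({rows[i][1] for i in block}) == 1)
    return pos / len(rows)

full = dependency(range(n_cond))
for a in range(n_cond):
    reduced = dependency([b for b in range(n_cond) if b != a])
    print(f"attribute {a}:", "core" if reduced < full else "dispensable")
```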
Jet fuel estimation model based on flight data analysis [J].
Data preprocessing for data mining: system design and realization [D]. Beijing: Beijing Jiaotong University, 2011.
Online since: June 2010
Authors: Guang Bin Wang, Liang Pei Huang
In noise reduction algorithms based on manifold learning, the phase-space data may be distorted and the target reduction dimension is chosen arbitrarily, which lowers the efficiency and effectiveness of noise reduction.
The global alignment by affine transformation is replaced with mean reconstruction, the intrinsic dimension is estimated as the target reduction dimension using the maximum likelihood estimation (MLE) method, and the data outside the intrinsic-dimension space are eliminated.
The algorithm performs better when appropriate parameters are chosen, but the data may be distorted during dimension reduction, global alignment, and the inverse signal solution, leading to a poor noise reduction effect.
Conclusion: In the LTSA noise reduction algorithm, the target reduction dimension is chosen arbitrarily and the phase-space data may be distorted during dimension reduction, global alignment, and the inverse signal solution, which lowers the efficiency and effectiveness of noise reduction. We therefore propose a local tangent space mean reconstruction algorithm.
First, the intrinsic dimension is obtained by MLE; then the high-dimensional data in the phase space are reduced to the intrinsic-dimension space; finally, the data in the local tangent space are returned to the global high-dimensional phase space.
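A rough Python sketch of the first two steps only, under assumed placeholder data and parameters: the intrinsic dimension is estimated with the Levina-Bickel MLE and the data are then reduced to that dimension with LTSA (via scikit-learn). The mean reconstruction back to the global phase space described in the paper is not shown.

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.neighbors import NearestNeighbors
from sklearn.manifold import LocallyLinearEmbedding

# Placeholder data standing in for the reconstructed phase space:
# a noisy 2-D manifold embedded in 3-D.
X, _ = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)

k = 12  # assumed neighbourhood size

# Levina-Bickel MLE intrinsic dimension estimate, averaged over all points.
nbrs = NearestNeighbors(n_neighbors=k + 1).fit(X)
dist, _ = nbrs.kneighbors(X)
dist = dist[:, 1:]                        # drop the zero distance to self
logs = np.log(dist[:, -1:] / dist[:, :-1])
m_hat = 1.0 / logs.mean(axis=1)           # per-point dimension estimates
d = int(round(m_hat.mean()))
print("estimated intrinsic dimension:", d)

# Reduce the phase-space data to the estimated dimension with LTSA.
ltsa = LocallyLinearEmbedding(n_neighbors=k, n_components=d, method='ltsa')
Y = ltsa.fit_transform(X)
print("reduced data shape:", Y.shape)
```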
Online since: December 2004
Authors: X.D. Zhang, C. Wang, Chang Ku Sun, S.H. Ye
A data reduction method is presented based on the grid reduction method, with consideration of color-boundary preservation.
Therefore, the research work of this paper provides a preprocessing flow, including data arrangement, noise filtering, smoothing, data reduction, division, and reorganization.
The noise filtering and data reduction methods are analyzed in particular detail.
The preprocessing research places emphasis on data reduction, considering the overcrowded point cloud data.
Data reduction performs sampling of the point cloud and reduces the quantity of data.
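As a minimal numpy sketch of plain grid-based reduction (the color-boundary preservation discussed in the paper is omitted, and the cell size is an arbitrary choice), one representative point is kept per occupied grid cell:

```python
import numpy as np

def grid_reduce(points, cell_size):
    """Keep one representative point (the centroid) per occupied grid cell."""
    cells = np.floor((points - points.min(axis=0)) / cell_size).astype(np.int64)
    _, inverse = np.unique(cells, axis=0, return_inverse=True)
    inverse = inverse.reshape(-1)
    n_cells = inverse.max() + 1
    # Accumulate per-cell centroids.
    sums = np.zeros((n_cells, points.shape[1]))
    counts = np.zeros(n_cells)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]

# Example with a random dense point cloud (placeholder data).
cloud = np.random.rand(100000, 3)
reduced = grid_reduce(cloud, cell_size=0.05)
print(len(cloud), "->", len(reduced), "points")
```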
Online since: September 2014
Authors: Li Bo Zhou, Su Hua Liu, Fu Lin Xu
The Research of Point Cloud Data Processing Technology
Libo Zhou 1,a, Fulin Xu 1,b, Suhua Liu 1,c
(1 Advanced Vocational Technical College, Shanghai University of Engineering Science, Shanghai, China, 200437)
a Vincent366@126.com, b fulin_xu@163.com, c Liush968@126.com
Keywords: Coons Surface, Filtering Noise, Data Reduction, Data Smoothing, Data Block
Abstract: Data processing is a key step in reverse engineering; its results directly affect the quality of the model reconstruction.
Data reduction: Laser scanning data cannot be used directly; otherwise, it not only slows computation and strains storage but may also make it impossible to generate surfaces, or generate distorted surfaces.
Data reduction removes the large amount of redundant data that exists in the data set without generating new points.
Sometimes data reduction causes useful data to be lost, especially at sharp angles, ridge lines of high curvature, and regions where the data change; such regions should be processed according to their local shape conditions.
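As an illustrative companion rather than the authors' method, the Ramer-Douglas-Peucker algorithm is one common way to thin an ordered scan line or profile curve while keeping sharp corners; the tolerance and sample data below are arbitrary.

```python
import numpy as np

def rdp(points, tol):
    """Ramer-Douglas-Peucker: thin an ordered 2-D polyline, keeping points that
    deviate from the chord between kept neighbours by more than tol."""
    points = np.asarray(points, dtype=float)
    if len(points) < 3:
        return points
    start, end = points[0], points[-1]
    chord = end - start
    norm = np.linalg.norm(chord)
    if norm == 0.0:
        dists = np.linalg.norm(points - start, axis=1)
    else:
        # Perpendicular distance of every point to the chord.
        d = points - start
        dists = np.abs(chord[0] * d[:, 1] - chord[1] * d[:, 0]) / norm
    i = int(np.argmax(dists))
    if dists[i] <= tol:
        return np.vstack([start, end])       # segment is flat enough
    left = rdp(points[: i + 1], tol)         # recurse on both halves
    right = rdp(points[i:], tol)
    return np.vstack([left[:-1], right])

# Example: a noisy profile with a sharp corner (placeholder data).
x = np.linspace(0, 2, 200)
y = np.where(x < 1, x, 2 - x) + 0.002 * np.random.randn(x.size)
thinned = rdp(np.column_stack([x, y]), tol=0.01)
print(len(x), "->", len(thinned), "points")
```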
The data in NX are shown in Fig. 7a, the contour point cloud data in Fig. 7b, the data-block grid point cloud data in Fig. 7c, and the profile curve in Fig. 7d.