Search results for "Data Reduction"

Online since: October 2013
Authors: Tsair Rong Chen, Wen Chin Lin
The researcher revised the questionnaire, conducted the survey, analyzed the data, and selected 685 university students by stratified cluster sampling.
The data obtained were analyzed with descriptive statistics, independent-samples t tests, one-way ANOVA, and a structural equation model.
Data processing: (1) Cognition of energy saving and carbon reduction. The "cognition of energy saving and carbon reduction" scale contains 17 items.
Path analysis of cognition and attitude toward energy saving and carbon reduction: using the samples collected in the formal test and the SEM program Amos 17.0, this study probes the relationship between students' cognition of and attitude toward energy saving and carbon reduction, in order to validate the fit between the theoretical model and the actual data.
Multivariate Data Analysis (5th ed.), Prentice-Hall, Englewood Cliffs, NJ (1998).
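The statistical battery the abstract lists (independent-samples t test, one-way ANOVA) can be sketched with SciPy; the group sizes, means, and seed below are illustrative placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical cognition scores for three student groups (not the study's data)
group_a = rng.normal(4.0, 0.5, 60)
group_b = rng.normal(3.8, 0.5, 60)
group_c = rng.normal(3.9, 0.5, 60)

t, p = stats.ttest_ind(group_a, group_b)          # independent-samples t test
f, p_anova = stats.f_oneway(group_a, group_b, group_c)  # one-way ANOVA
print(f"t = {t:.2f}, p = {p:.3f}")
print(f"F = {f:.2f}, p = {p_anova:.3f}")
```

A structural equation model would follow the same pattern with a dedicated SEM package, but the paper's model specification is not given in the abstract.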
Online since: May 2013
Authors: Feng Chang Xue
Applications showed that Geomagic can extract the useful point cloud from the original scan, delete speckles, and, through resampling and noise-reduction processing, improve point cloud quality and reduce data deviation, thus improving data-processing efficiency.
Merging cloud data. Merging cloud data refers to merging the scan data sets into a single point cloud object, which becomes the full point cloud data.
Noise reduction smooths the data and reduces deviation, so that the point cloud used in modeling better fits the object's true shape.
Encapsulating data. Encapsulating data refers to computing an encapsulation of the point cloud, converting the point cloud data into polygon model data.
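As a rough illustration of the speckle-deletion and resampling steps described above (not Geomagic's actual algorithms), a statistical outlier filter plus voxel-grid averaging can be sketched in NumPy; all sizes and thresholds here are assumptions.

```python
import numpy as np

def remove_speckles(points, k=8, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest neighbours is
    unusually large (a simple statistical outlier filter)."""
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(-1))
    knn = np.sort(dists, axis=1)[:, 1:k + 1].mean(axis=1)
    keep = knn < knn.mean() + std_ratio * knn.std()
    return points[keep]

def voxel_resample(points, voxel=0.05):
    """Resample by averaging all points that fall in the same voxel."""
    keys = np.floor(points / voxel).astype(int)
    buckets = {}
    for key, p in zip(map(tuple, keys), points):
        buckets.setdefault(key, []).append(p)
    return np.array([np.mean(v, axis=0) for v in buckets.values()])

rng = np.random.default_rng(1)
cloud = rng.normal(0, 0.05, (200, 3))          # dense noisy surface patch
cloud = np.vstack([cloud, [[5.0, 5.0, 5.0]]])  # one far-away speckle
clean = remove_speckles(cloud)                 # the speckle is removed
sparse = voxel_resample(clean)                 # fewer, smoother points
print(len(cloud), len(clean), len(sparse))
```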
Online since: September 2013
Authors: Wen Jing Zhao, Qiu Na Zhang, Ai Min Yang, Na Ji
Introduction. Rough Set Theory, a new kind of mathematical data-analysis theory, was put forward in 1982 by the Polish mathematician Z. Pawlak.
Rough sets are therefore widely used in machine learning, knowledge discovery, knowledge acquisition, decision analysis, data mining, pattern recognition, and so on; rough set theory has become a focus of study in international artificial intelligence research.
In this paper, on the basis of rough set theory's notion of data reduction, granular computing is used for knowledge reduction.
Output: an attribute reduction.
The resulting reduction then follows.
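As a toy illustration of attribute reduction in the rough set sense, the sketch below enumerates the minimal attribute subsets (reducts) that keep a decision table consistent; the table itself is invented for the example and is not from the paper.

```python
from itertools import combinations

# Hypothetical decision table: columns 0-2 are condition attributes,
# the last column is the decision attribute.
table = [
    (0, 0, 1, "yes"),
    (0, 1, 1, "yes"),
    (1, 0, 0, "no"),
    (1, 1, 0, "no"),
    (0, 0, 0, "no"),
]

def consistent(attrs):
    """True if objects equal on `attrs` always share a decision value."""
    seen = {}
    for row in table:
        key = tuple(row[i] for i in attrs)
        if seen.setdefault(key, row[3]) != row[3]:
            return False
    return True

def reducts():
    """All minimal consistent attribute subsets (reducts) of the table."""
    full = (0, 1, 2)
    found = []
    for r in range(1, len(full) + 1):
        for subset in combinations(full, r):
            if consistent(subset) and \
               not any(set(f) <= set(subset) for f in found):
                found.append(subset)
    return found

print(reducts())   # here attribute 2 alone already determines the decision
```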
Online since: July 2011
Authors: Cheng Gen Dong, Qi Shan Zhang, Jin Pei Wu
From this point of view, a fingerprint algorithm is a conversion from the original text to a hash value (an integer array), and the telecommunications application here closely resembles data validation.
Therefore, this paper uses a data-validation algorithm to calculate fingerprints of telecommunications network events.
Cyclic Redundancy Check (CRC) uses simple binary polynomial division to obtain a data checksum; CRC is a popular method of ensuring that data (e.g., a file) have not been corrupted [4-6].
Using roughly one hundred thousand actual events from a network operator, text-mode reduction and fingerprint-based reduction were compared, with the results shown in Table 3 (unit: ms):

Table 3 Performance comparison of the reduction process
Reduction method | Fingerprint calculation | Reduction of non-fault events | Reduction of repeating events | Total time
Text | 0 | 462 | 442 | 904
Fingerprint | 731 | 33 | 32 | 796

As Table 3 shows, for this part of telecommunications network-event reduction the fingerprint method improves performance over text mode by 11.95%.
References
[1] J. Stone, R. Stewart, D. Otis: RFC 3309, Stream Control Transmission Protocol (SCTP) Checksum Change, September 2002.
[2] MIT Laboratory for Computer Science and RSA Data Security, Inc.: RFC 1321, The MD5 Message-Digest Algorithm.
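A minimal sketch of fingerprinting an event record with CRC-32, one common data-validation checksum of the kind discussed above (the paper's exact CRC polynomial and event format are not given, so the record format below is an assumption):

```python
import zlib

def event_fingerprint(event_text: str) -> int:
    """Fingerprint a network-event record with CRC-32 (a common
    data-validation checksum; the paper's exact algorithm may differ)."""
    return zlib.crc32(event_text.encode("utf-8"))

a = event_fingerprint("LINK-DOWN node=bsc17 port=3")
b = event_fingerprint("LINK-DOWN node=bsc17 port=3")
c = event_fingerprint("LINK-DOWN node=bsc17 port=4")
assert a == b       # identical events share a fingerprint,
assert a != c       # so repeated events can be dropped by integer comparison
print(hex(a))
```

Comparing 32-bit integers instead of full event texts is what makes the repeated-event reduction step cheap, at the cost of the up-front fingerprint calculation seen in Table 3.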
Online since: December 2013
Authors: Lian Ying Zhang, Si Lu Zhang
This paper presents a practical application of the Eco-costs Model in evaluating different emission-reduction measures for construction equipment. We propose a comparison model whose input data are the emission-reduction percentages and the Eco-costs of the emissions.
Fig.1 Eco-costs of Air Pollution on Construction Site Input data of Eco-costs Model.
(Exchange rates are based on data from October 2012.)

Table 1 Eco-costs of emissions per unit (2012)
Pollutant/Source | CO | HC | NOX | PM | Diesel
Euro [€/kg] | 0.240 | 3.727 | 5.290 | 27.44 | /
RMB [¥/kg] | 1.965 | 30.52 | 43.32 | 224.7 | 8.500

Construction of the comparison model. Because combustion-chamber cleanup techniques require a substantial structural change to the diesel engine, the Eco-costs Model is not suitable for them.
ΔPm = βm·fm·I + Σ(n=1..N) Δrnm·qn·cn   (1)

fm = (1 − im·H/(hm·I))·t if an alternative fuel is used, and 0 otherwise   (2)

ΔPm: price reduction of Measure m for one kilowatt-hour of work
fm: price-adjustment coefficient of the alternative fuel in Measure m
H: heat value of diesel, MJ/kg
hm: heat value of the alternative fuel in Measure m, MJ/kg
I: price of diesel per kilogram
im: price of the alternative fuel per kilogram in Measure m
t: diesel equivalent of one kilowatt-hour, equal to 0.084 kg
βm: proportion of alternative fuel in Measure m
Δrnm: emission-reduction percentage of Pollutant n in Measure m
qn: emission quantity of Pollutant n if no measure is taken, g
cn: Eco-cost of Pollutant n

Data collection and calculation. Six measures are analyzed in this article.
The others use alternative-fuel techniques, including Dimethyl Ether (DME), Biodiesel (B100), and Liquefied Natural Gas (LNG). The emission-reduction percentages are based on technical testing authorities in China [8, 9], and the emission quantities are taken as the European Union emission standards (Stage IV), which the Chinese government has approved as a long-term strategy. Detailed data and results are listed in Table 2, where the functional currency is the Chinese Yuan.
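Eqs. (1)-(2) can be sketched directly in code. The Eco-costs per kilogram come from Table 1 and t = 0.084 kg/kWh from the definitions above; every other number below (fuel prices, heat values, reduction percentages, baseline emissions) is an invented placeholder, not the paper's data.

```python
# Sketch of Eqs. (1)-(2): price reduction of an emission-reduction measure
# per kilowatt-hour of work.
T_DIESEL = 0.084           # kg of diesel per kWh (t in Eq. 2)
ECO_COSTS = {"CO": 1.965, "HC": 30.52, "NOX": 43.32, "PM": 224.7}  # RMB/kg, Table 1

def price_reduction(beta, H, h_m, I, i_m, delta_r, q):
    """Eq. (1): dP_m = beta_m * f_m * I + sum_n dr_nm * q_n * c_n.
    delta_r and q map pollutant -> reduction fraction and grams per kWh."""
    f_m = (1 - i_m * H / (h_m * I)) * T_DIESEL if beta > 0 else 0.0  # Eq. (2)
    eco_saving = sum(delta_r[n] * (q[n] / 1000.0) * ECO_COSTS[n] for n in q)
    return beta * f_m * I + eco_saving

# Hypothetical LNG-style measure with 100 % fuel substitution (all assumed)
dp = price_reduction(
    beta=1.0, H=42.7, h_m=50.0, I=8.5, i_m=6.0,        # MJ/kg and RMB/kg
    delta_r={"CO": 0.9, "HC": 0.8, "NOX": 0.3, "PM": 0.95},
    q={"CO": 5.0, "HC": 1.0, "NOX": 3.3, "PM": 0.4})    # g/kWh
print(round(dp, 3))      # RMB saved per kWh under these assumptions
```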
Online since: September 2011
Authors: Long Zhen Duan, Zhi Xin Zou, Gui Fen Wang
The Web Classification Based on ROUGH-GA-BP. Long Zhen Duan, Zhi Xin Zou and Gui Fen Wang, Department of Computer Application Technology, Nanchang University, Jiangxi, China. lzhduan@126.com, zzhxin@163.com, fenfen353@126.com. Keywords: Text Classification Algorithm, ROUGH-GA-BP, Data Reduction. Abstract.
This algorithm reduces the data of the text input vector with a data reduction method based on rough set theory, and presents a genetic algorithm approach for feature selection.
Introduction Because of the rapid growth of text data, automatic methods of data management are especially important.
In document [1], rough set reduction methods are used to pre-process the information and remove redundant data; the reduced decision tables serve as the design basis and the training data of the neural network.
Test and Analysis. The corpus used in this study is the Chinese web page data set CWT100g, collected by Peking University's network laboratory.
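As a stand-in for the GA feature-selection stage of ROUGH-GA-BP (the paper's fitness function, built on the reduced decision table and BP classification accuracy, is not reproduced in the abstract), a generic genetic algorithm for feature selection can be sketched as follows; the feature set and fitness are toy assumptions.

```python
import random

random.seed(0)

N_FEATURES = 10
USEFUL = {1, 4, 7}        # features that actually matter (hidden from the GA)

def fitness(mask):
    hits = sum(1 for i in USEFUL if mask[i])   # reward useful features found
    cost = 0.1 * sum(mask)                     # penalize large subsets
    return hits - cost

def evolve(pop_size=20, generations=40):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]            # elitist selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_FEATURES)  # one-point crossover
            child = a[:cut] + b[cut:]
            j = random.randrange(N_FEATURES)       # point mutation
            child[j] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print([i for i, bit in enumerate(best) if bit])
```

In the paper's pipeline the rough set reduction first shrinks the candidate feature set, and the selected features then feed the BP network's input vector.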
Online since: May 2011
Authors: Wan Zheng Ai, Bai Gang Huang
The energy loss coefficient of sudden reduction tube flows is an important index of a sudden reduction tube.
An empirical expression, verified by comparison with other experimental data, is presented to calculate the energy loss coefficient of sudden reduction tube flows. 1 Introduction. Sudden reduction tubes are broadly used in drainage pipes and discharge tunnels, such as orifice-plate discharge tunnels and plug discharge tunnels.
Figure 2, drawn from the data in Table 2, demonstrates that when Re is constant, the energy loss coefficient ξ decreases as the contraction ratio β increases. ξ is closely related to La and Lb.
Figure 3, drawn from the data in Table 2, demonstrates that the vortices are the important regions of energy dissipation.
Table 1 Variation of ξ with Re (β = 0.50)
Re (×10^5) | 0.90 | 1.80 | 9.20 | 18.40 | 27.60
ξ | 10.75 | 10.92 | 10.92 | 10.92 | 10.92

Table 2 Variation of backflow lengths and ξ with β (Re = 1.8×10^5)
β | 0.4 | 0.5 | 0.6 | 0.7 | 0.8
La | 0.17 | 0.12 | 0.09 | 0.07 | 0.05
Lb | 0.45 | 0.41 | 0.39 | 0.34 | 0.32
ξ | 31.65 | 10.92 | 4.11 | 1.46 | 0.47

Figure 2 Variation of ξ with β. Figure 3 Variation of ξ with La and Lb.

4 Verification. Figure 4 compares the simulation data of this paper with the data presented by Liu [6] (2002); the comparison implies that the conclusion obtained by simulation here is very close to Liu's.
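Table 2 lends itself to a small interpolation utility: since ξ falls by nearly two orders of magnitude as β rises from 0.4 to 0.8, log-linear interpolation between the tabulated points is a reasonable sketch. The paper's own empirical expression is not reproduced in this abstract, so this is only an approximation built from the table.

```python
import bisect
import math

# xi(beta) data from Table 2 (Re = 1.8e5)
BETA = [0.4, 0.5, 0.6, 0.7, 0.8]
XI   = [31.65, 10.92, 4.11, 1.46, 0.47]

def xi_of_beta(beta):
    """Log-linear interpolation of the energy loss coefficient."""
    if not BETA[0] <= beta <= BETA[-1]:
        raise ValueError("beta outside tabulated range")
    j = min(bisect.bisect_left(BETA, beta), len(BETA) - 1)
    i = max(j - 1, 0)
    if i == j:                      # beta equals the first tabulated point
        return XI[i]
    w = (beta - BETA[i]) / (BETA[j] - BETA[i])
    return math.exp((1 - w) * math.log(XI[i]) + w * math.log(XI[j]))

print(round(xi_of_beta(0.55), 2))  # geometric mean of the two neighbours
```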
Online since: January 2015
Authors: Qing Chao Jiang
The Research of High-Dimensional Big Data Dimension Reduction Strategy. Qingchao Jiang, Hebei Software Institute, Baoding, Hebei, China. 1358206@sina.com. Keywords: High-dimensional data, Data mining, DE, Lasso, Lars.
The huge amount of data is often described as "big data".
(2) Data processing. Because the dimensionality of the data is very large, data processing uses a grouping strategy: the eigenvectors are divided into T intervals, named Xi.
Since the purpose of this article is dimension reduction for high-dimensional data, the Lars algorithm is chosen, and the non-zero elements contained in Ei(t+1) and Gi(t) are compared to measure the fitness of the mutated individual against its parent.
The experiment uses a data set from the UCI database.
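The Lars selection step described above can be sketched with scikit-learn: fit a sparse model, then keep only the dimensions whose coefficients are non-zero. The synthetic data below (features 2 and 17 carrying the signal) are an assumption for illustration, not the paper's UCI experiment.

```python
import numpy as np
from sklearn.linear_model import Lars

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))              # 50-dimensional inputs
y = 3.0 * X[:, 2] - 2.0 * X[:, 17] + 0.01 * rng.normal(size=200)

# Lars adds predictors one at a time along the regularization path;
# capping the non-zero coefficients performs the dimension reduction.
model = Lars(n_nonzero_coefs=2).fit(X, y)
selected = np.flatnonzero(model.coef_)
print(selected)          # the informative dimensions survive the reduction
```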
Online since: September 2011
Authors: Jing Zhang, Hong Guang Jia, Ling Ding, Wen Hui Dong
Data testing is very important for the development of a new missile.
Shock isolation of the mechanical structure is important for improving system reliability and rationality, and it directly affects the success of data recovery [1].
Buffer and Vibration Reduction of the Original Structure Design. The original FEA model of the data recorder is shown in Figure 1 (Fig. 1 The FEA model of the data recorder; Fig. 2 The FEA model of target and projectile). Figure 1 depicts the 1/2 axially symmetric storage device, where 1 is the case, 2 is the buffer material, and 3 is the storage board.
The buffer and vibration design of the data recorder structure follows stress wave theory, with a reasonable allocation of the generalized impedance ratio. Taking reduction of the maximum dynamic stress on the data-storage chip as the goal, a finite element model of missile penetration is set up jointly with Isight and Ls-dyna to complete the optimization design of the data recorder's buffering and vibration-reduction structure.
Marshall Hammer Experiment. Figure 9 shows the physical composition of the data recorder, and Figure 10 shows the Marshall hammer overload curve (Fig. 9 The data recorder; Fig. 10 The overload curve of the data recorder). Conclusion: stress wave theory successfully solved the impact-resistance problem in the narrow space, bringing the dynamic stress of the protected component down by about 60.9%, which was verified by the successful Marshall hammer test.
Online since: March 2009
Authors: Evgeny N. Selivanov, R.I. Gulyaeva, A.N. Mansurova
The Netzsch Thermokinetics program was used for the analysis of the experimental data.
The experimental data satisfy a kinetic model of the СаFеSО reduction process by carbon monoxide.
Fig. 5 Polytherms of Са2FеСuSО3 sample reduction in a flow (75 % CO - 25 % Ar) (a) and the results of processing the experimental data by the Friedman method (b).
According to the data of [16], they are within the limits of 108.8 - 156.6 kJ/mol.
Summary. The data obtained on the kinetics of reduction of СаFеSО and Са2FеСuSО3 oxysulfides by carbon monoxide are evidence of differences in the process mechanisms.
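The Friedman method mentioned in Fig. 5 extracts activation energies from the slope of ln(dα/dt) versus 1/T at fixed conversion. A minimal sketch with synthetic rates, where the true Ea is set to 130 kJ/mol (inside the 108.8 - 156.6 kJ/mol range quoted above); the pre-exponential factor and temperatures are assumptions, not the paper's measurements.

```python
import math

R = 8.314          # gas constant, J/(mol K)
EA_TRUE = 130e3    # assumed activation energy, J/mol
A = 1e8            # assumed pre-exponential factor, 1/s

# Synthetic Arrhenius rates at fixed conversion
temps = [900.0, 950.0, 1000.0, 1050.0]                 # K
rates = [A * math.exp(-EA_TRUE / (R * T)) for T in temps]

# Least-squares slope of ln(rate) vs 1/T; slope = -Ea/R
xs = [1.0 / T for T in temps]
ys = [math.log(r) for r in rates]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
ea_est = -slope * R
print(round(ea_est / 1000, 1))   # kJ/mol -> 130.0, recovering the input Ea
```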