Search results

Online since: January 2014
Authors: Ming Yu Li, Bao Lin Li, Hai Hao Liu, Gang Cao, Gang Ren, Lin Song, Xiang Dong Ye
In the dry category, selective catalytic reduction (SCR) and selective non-catalytic reduction (SNCR) are the most widely used methods [2-3].
This paper presented a new "absorption-oxidation-reduction" method that used a ferrous sulfate solution as the absorbent, oxygen as the oxidizer, and urea as the reducer to remove NOx from flue gas.
This method could provide important reference data for wet denitration and has good application prospects.
2 Experimental Section
2.1 Experimental apparatus and instruments
The experimental apparatus and flow diagram of the "absorption-oxidation-reduction" coupling method are shown in Fig. 1.
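The headline figure in such wet-denitration experiments is the NOx removal efficiency, computed from inlet and outlet concentrations. A minimal sketch, using hypothetical flue-gas readings (the 500/150 ppm values are illustrative, not from the paper):

```python
# Minimal sketch: NOx removal efficiency as typically reported in
# wet-denitration experiments. Concentration values are hypothetical.

def removal_efficiency(c_in_ppm: float, c_out_ppm: float) -> float:
    """Return fractional NOx removal: (inlet - outlet) / inlet."""
    if c_in_ppm <= 0:
        raise ValueError("inlet concentration must be positive")
    return (c_in_ppm - c_out_ppm) / c_in_ppm

# Example with made-up flue-gas readings (ppm NOx):
eta = removal_efficiency(500.0, 150.0)
print(f"NOx removal efficiency: {eta:.0%}")  # prints 70%
```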
Low-temperature selective catalytic reduction of NO with NH3 over Ce-Mn/TiO2 catalyst [J].
NO removal by reducing agents and additives in the selective non-catalytic reduction (SNCR) process [J].
Online since: September 2019
Authors: Vladimir Aseev, Victor A. Klinkov, Iuliia Tuzova, Yurii Fedorov
According to the data presented in Table 1, bismuth in some glass systems may exhibit broadband near-infrared luminescence with a full width at half maximum (FWHM) on the order of 250-300 nm, centered near 1.3 µm.
Therefore, reducing conditions are needed.
The absorption spectra of samples treated under reducing conditions were also measured.
These data are in accordance with the literature, and the measured Bi near-infrared luminescence under 455 nm excitation is shown [2, 3].
Taking these data into account, we can conclude that the luminescence centers have the same nature.
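The FWHM figure quoted above is read off a measured spectrum as the width of the band at half its peak intensity. A minimal NumPy sketch, using a synthetic Gaussian band (center 1300 nm, FWHM 275 nm) as a stand-in for real glass data:

```python
import numpy as np

# Sketch: estimating the FWHM of a near-infrared luminescence band.
# The band here is a synthetic Gaussian centered at 1300 nm with an
# FWHM of ~275 nm; values for any real glass are assumptions.
wavelength = np.linspace(900.0, 1700.0, 2001)        # nm, 0.4 nm steps
sigma = 275.0 / (2.0 * np.sqrt(2.0 * np.log(2.0)))   # FWHM = 2*sqrt(2 ln 2)*sigma
intensity = np.exp(-((wavelength - 1300.0) ** 2) / (2.0 * sigma**2))

half = intensity.max() / 2.0
above = wavelength[intensity >= half]                 # band region at/above half max
fwhm = above.max() - above.min()
print(f"estimated FWHM: {fwhm:.1f} nm")
```

On a real spectrum the same half-maximum crossing is usually refined by interpolating between grid points; the grid here is fine enough to skip that.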
Online since: June 2014
Authors: Farid Nasir Ani, M. Adlan Abdullah, Masjuki Hassan
The dynamometer controller also records data such as engine speed, torque, power, engine temperatures, throttle pedal position, fuel flow, and air flow in real time.
Prior to running the engine for data collection, the engine was warmed up for 20 minutes at 2000 rpm and 20% load.
The engine was allowed to stabilize before the data were recorded.
Data for other speeds and loads also showed either slight increase in NOx emissions or no significant change.
A reduction of up to 56% was observed at a 40% biodiesel fraction.
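The comparison across speeds and loads described above amounts to computing the percent change in NOx at each operating point relative to the baseline fuel and flagging large shifts. A sketch with entirely hypothetical readings (the 5% threshold for "significant" is also an assumption):

```python
import numpy as np

# Sketch: screening NOx readings over a speed x load grid for percent
# change relative to a baseline fuel. All readings are hypothetical.
speeds = [1500, 2000, 2500]           # rpm
loads = [20, 40, 60]                  # percent
baseline = np.array([[310., 420., 530.],
                     [300., 415., 525.],
                     [295., 410., 520.]])   # ppm NOx, baseline fuel
blend = np.array([[315., 428., 531.],
                  [306., 418., 540.],
                  [297., 412., 522.]])      # ppm NOx, biodiesel blend

change = (blend - baseline) / baseline * 100.0   # percent change per point
significant = np.abs(change) > 5.0               # assumed significance cutoff
for i, s in enumerate(speeds):
    for j, l in enumerate(loads):
        flag = " *" if significant[i, j] else ""
        print(f"{s} rpm / {l}% load: {change[i, j]:+.1f}%{flag}")
```

With these made-up numbers every point shows only a slight increase, mirroring the "slight increase or no significant change" observation in the excerpt.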
Online since: December 2012
Authors: Si Yuan Cheng, Tian Li, Xue Peng Wang, Xue Rong Yang
In this paper, we propose a new method of data repairing based on Anatomical CAD.
Generally, Reverse Engineering refers to the process of reconstructing CAD models from existing parts, which consists of data acquisition, data pre-processing, incomplete data repairing and CAD modeling [1].
When capturing data, the measured data of a product can be incomplete because some surfaces are inaccessible or invisible to the measuring tools, which requires post-processing of the data and repair of the model.
Fig. 3 The scanned point cloud of the model
Fig. 4 The STL data
Generally, a portable CMM acquires many more points than necessary, which significantly slows data processing.
After the Noise Reduction, Sub Sample, and Mesh Data operations on the point cloud, an STL file is obtained, as shown in Fig. 4.
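The two pre-processing steps named above (noise reduction and subsampling) can be sketched in plain NumPy: a statistical filter that drops points whose mean neighbor distance is an outlier, and a grid (voxel) filter that keeps one point per cell. The thresholds, cell size, and synthetic cloud are assumptions, not the actual tool settings:

```python
import numpy as np

# Sketch of point-cloud noise reduction and subsampling in plain NumPy.
rng = np.random.default_rng(0)
cloud = rng.uniform(0.0, 10.0, size=(1000, 3))   # stand-in scan data

def denoise(points, k=8, std_ratio=2.0):
    """Drop points whose mean distance to k neighbors is an outlier."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    knn_mean = np.sort(d, axis=1)[:, 1:k + 1].mean(axis=1)  # skip self (0)
    keep = knn_mean < knn_mean.mean() + std_ratio * knn_mean.std()
    return points[keep]

def subsample(points, cell=0.5):
    """Keep one point per occupied grid cell (voxel) of size `cell`."""
    keys = np.floor(points / cell).astype(int)
    _, first = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(first)]

reduced = subsample(denoise(cloud), cell=0.5)
print(len(cloud), "->", len(reduced), "points")
```

Real scans are far larger than 1,000 points, so a production version would use a k-d tree rather than the full pairwise distance matrix computed here.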
Online since: December 2013
Authors: Yi Zheng
Data and Results
Data Collection.
The data needed by the indexes cover 11 coastal provinces and municipalities.
Table 1 gives the correlation matrix between the input and output data.
[2] Quanling Wei. Data envelopment analysis.
A slack-based measure of efficiency in data envelopment analysis.
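The efficiency scores behind such a study come from solving one small linear program per decision-making unit (DMU). As a minimal illustration, here is the basic input-oriented CCR model (the excerpt cites a slack-based measure, which is a refinement of this radial model); the DMU data are made up:

```python
import numpy as np
from scipy.optimize import linprog

# Sketch of a basic input-oriented CCR DEA model. Toy data: one input,
# one output per DMU -- all values are illustrative.
X = np.array([[2.0], [4.0], [3.0]])   # inputs,  one row per DMU
Y = np.array([[2.0], [2.0], [3.0]])   # outputs, one row per DMU
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(o):
    """Efficiency of DMU o: min theta s.t. X'lam <= theta*x_o, Y'lam >= y_o."""
    c = np.zeros(1 + n); c[0] = 1.0                    # minimize theta
    # inputs:  sum_j lam_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # outputs: -sum_j lam_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A = np.vstack([A_in, A_out])
    b = np.concatenate([np.zeros(m), -Y[o]])
    res = linprog(c, A_ub=A, b_ub=b,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```

DMUs 0 and 2 sit on the constant-returns frontier (efficiency 1), while DMU 1 uses twice the input of DMU 0 for the same output and scores 0.5.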
Online since: February 2013
Authors: Zong Hai Sun, Osman Osman
Dimensionality reduction is the mapping of high-dimensional data into some representation of lower dimensionality.
We apply the GPLVM method for dimensionality reduction to the high-dimensional data set. We then evaluate the quality of the resulting low-dimensional representation by comparing the results to those obtained with intrinsic dimensionality estimation techniques, and finally compare the performance of GPLVM and PCA on the same data set.
The data set consists of 53 data points with a dimensionality of 87 features.
Now that our data set has been successfully downsized using the dimensionality reduction techniques illustrated above, the next step is to use this outcome for effective and fast clustering.
Fig. 3 shows the resulting data clusters.
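The PCA baseline compared against above can be sketched in a few lines of NumPy via the SVD. The data here are random stand-ins, sized to match the quoted 53 points x 87 features:

```python
import numpy as np

# Sketch of the PCA baseline: project a high-dimensional set onto its
# top principal components. Data are random stand-ins, 53 x 87 to
# match the sizes quoted in the excerpt.
rng = np.random.default_rng(42)
data = rng.normal(size=(53, 87))

def pca(X, n_components):
    """Center X and project onto the top right singular vectors."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = (S**2) / (S**2).sum()        # per-component variance ratios
    return Xc @ Vt[:n_components].T, explained[:n_components]

low_dim, ratios = pca(data, n_components=2)
print("reduced shape:", low_dim.shape)
print("variance explained by 2 components:", ratios.sum().round(3))
```

Unlike PCA, GPLVM learns the low-dimensional coordinates as latent variables of a Gaussian-process mapping, so it can capture nonlinear structure that this linear projection misses.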
Online since: May 2013
Authors: Yan Guang Liu
Therefore, the staff of the HeShan environmental monitoring station started monitoring in 2002 and have accumulated large amounts of observed data.
This paper makes a statistical analysis of precipitation data from the HeShan environmental monitoring station in order to study the temporal and spatial distribution characteristics and difference patterns, and to provide support and theory for acid rain control.
2. Results and analysis
There are 442 observation samples over the nearly nine years 2002-2010; only part of the data was monitored in 2002 and 2003, and no acid rain was monitored in 2002, 2003, 2009, and 2010.
The data show that, in the spatial distribution, acid rain appears less in urban areas and more in the suburbs.
So, one of the effective ways of managing acid rain is pollution prevention and waste reduction.
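The core statistic in such a study is the acid-rain frequency, the share of precipitation samples with pH below 5.6, compared between sites. A sketch with synthetic pH samples (the site means and sample split are assumptions; only the pH < 5.6 criterion is the standard one, and the 220 + 222 split merely echoes the 442-sample total quoted above):

```python
import numpy as np

# Sketch: acid-rain frequency (share of samples with pH < 5.6) by site.
# The pH samples are synthetic; only the pH < 5.6 criterion is standard.
rng = np.random.default_rng(1)
urban_ph = rng.normal(loc=5.9, scale=0.5, size=220)    # assumed urban mean
suburb_ph = rng.normal(loc=5.4, scale=0.5, size=222)   # assumed suburb mean

urban_freq = (urban_ph < 5.6).mean()
suburb_freq = (suburb_ph < 5.6).mean()
print(f"urban:  mean pH={urban_ph.mean():.2f}, acid-rain frequency={urban_freq:.0%}")
print(f"suburb: mean pH={suburb_ph.mean():.2f}, acid-rain frequency={suburb_freq:.0%}")
```

With these assumed means, the suburb shows the higher acid-rain frequency, matching the urban-less/suburb-more pattern described in the excerpt.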
Online since: October 2009
Authors: Chang J. Wang, Stuart Tomlinson, Colin Morgan
Due to uncertainties in the derivation of numerical data and other related information, such as the sources of raw materials, the embodied carbon emissions are calculated and analyzed from material emission factors using the Life Cycle Assessment method.
The water industry research strategy has three phases with decreasing priorities based on the reliability of data and information.
Product information was derived from technical data sheets and operating and maintenance instruction manuals.
A level of sparse and/or inaccurate data is an unavoidable feature of such analyses; for example, the sources of raw materials used in identical products from a single manufacturer may vary widely.
Data for the individual processes applied to each component of the analysed pump unit, and the CO2 emissions they generate, are not available.
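The material-emission-factor calculation described above reduces to a tally of mass times cradle-to-gate factor per material. A sketch for a hypothetical pump unit; the masses and emission factors below are illustrative placeholders, not figures from the study:

```python
# Sketch of an embodied-carbon tally: mass of each material times a
# cradle-to-gate emission factor. All values are illustrative.
materials = {                     # material: (mass kg, factor kgCO2e/kg)
    "cast iron": (120.0, 1.9),
    "steel": (45.0, 2.8),
    "copper": (12.0, 3.8),
    "polymer": (3.0, 3.1),
}

embodied = {name: mass * factor for name, (mass, factor) in materials.items()}
total = sum(embodied.values())
for name, kg_co2e in sorted(embodied.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} {kg_co2e:8.1f} kgCO2e ({kg_co2e / total:.0%})")
print(f"{'total':10s} {total:8.1f} kgCO2e")
```

The per-material breakdown is what makes the sensitivity to uncertain factors visible: a dominant material (here cast iron) dominates the uncertainty of the total as well.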
Online since: August 2013
Authors: Peng Hao Zhang
Introducing the geodesic distance in place of the original Euclidean distance reduces the twenty-four-dimensional data produced by traditional MFCC feature extraction down to ten-dimensional data.
This paper applies manifold learning to feature extraction for speech recognition and proposes a new MFCC-Manifold feature extraction method; the dimensionality reduction is effective, and the amount of feature-value data is greatly decreased.
It realizes data dimension reduction and data visualization and obtains an appropriate mapping relation.
For convenience, to show directly the loss in feature-value data quantity, taking the feature-value data quantity of MFCC feature extraction as the standard, we define: Data ratio = (MFCC-Manifold feature-value data quantity) / (Mel feature-value data quantity).
TABLE I.
As can be seen from Table 1, the final recognition rate is comparable with that of traditional MFCC, and the data reduction is above 70%.
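The geodesic-distance substitution described above is the core of Isomap-style manifold learning: build a k-nearest-neighbor graph, take shortest-path distances as geodesics, then embed with classical MDS. A NumPy sketch reducing 24-dimensional frames to 10 dimensions; the random stand-in frames, k, and frame count are assumptions (only the 24-to-10 reduction follows the text):

```python
import numpy as np

# Sketch of an Isomap-style reduction: geodesic (graph shortest-path)
# distances in place of Euclidean ones, then classical MDS, 24 -> 10 dims.
# The feature vectors are random stand-ins for MFCC frames.
rng = np.random.default_rng(7)
X = rng.normal(size=(60, 24))                       # 60 frames, 24 features

D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
k = 10
G = np.full_like(D, np.inf)
idx = np.argsort(D, axis=1)[:, :k + 1]              # self plus k nearest
for i, nbrs in enumerate(idx):
    G[i, nbrs] = D[i, nbrs]
G = np.minimum(G, G.T)                              # symmetrize the graph
for mid in range(len(G)):                           # Floyd-Warshall geodesics
    G = np.minimum(G, G[:, mid:mid + 1] + G[mid:mid + 1, :])
G[np.isinf(G)] = G[np.isfinite(G)].max()            # patch disconnected pairs

# Classical MDS on the geodesic distance matrix.
n = len(G)
J = np.eye(n) - np.ones((n, n)) / n                 # centering matrix
B = -0.5 * J @ (G**2) @ J
vals, vecs = np.linalg.eigh(B)
top = np.argsort(vals)[::-1][:10]
Y = vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))
print("embedded shape:", Y.shape)
```

Floyd-Warshall is O(n^3) and fine at this toy size; real feature sets would use Dijkstra over a sparse graph instead.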
Online since: November 2011
Authors: Gauravjeet Singh, Sandeep Bal, Poonamjeet Kaur, Kanwaljit Kaur
Keywords: data mining, frequent pattern mining, association rules
Abstract: Frequent pattern mining has been a focused theme in data mining research.
Orlando et al. [13] proposed an algorithm that combines transaction reduction and direct data access.
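Transaction reduction exploits the fact that a transaction containing no frequent k-itemset cannot contain any frequent (k+1)-itemset, so it can be dropped from later passes. A sketch of Apriori with this pruning (my own minimal version, not Orlando et al.'s algorithm, which also adds direct data access):

```python
from itertools import combinations

def apriori_with_reduction(transactions, min_support):
    """Frequent itemsets via Apriori, dropping useless transactions per pass."""
    transactions = [frozenset(t) for t in transactions]
    min_count = min_support * len(transactions)   # absolute support threshold
    # Pass 1: count single items.
    counts = {}
    for t in transactions:
        for item in t:
            key = frozenset([item])
            counts[key] = counts.get(key, 0) + 1
    frequent = {c: v for c, v in counts.items() if v >= min_count}
    all_frequent = dict(frequent)
    k = 1
    while frequent:
        k += 1
        # Candidate generation: k-sets whose (k-1)-subsets are all frequent.
        items = sorted({i for s in frequent for i in s})
        candidates = [frozenset(c) for c in combinations(items, k)
                      if all(frozenset(sub) in frequent
                             for sub in combinations(c, k - 1))]
        counts = {c: 0 for c in candidates}
        reduced = []
        for t in transactions:
            hits = 0
            for c in candidates:
                if c <= t:
                    counts[c] += 1
                    hits += 1
            if hits:                  # transaction reduction: keep only
                reduced.append(t)     # transactions supporting a candidate
        transactions = reduced
        frequent = {c: v for c, v in counts.items() if v >= min_count}
        all_frequent.update(frequent)
    return all_frequent

tx = [{'a', 'b', 'c'}, {'a', 'b'}, {'a', 'c'}, {'b', 'c'}, {'a', 'b', 'c', 'd'}]
print(apriori_with_reduction(tx, 0.6))
```

At 60% support on this toy database, the three single items and three pairs survive, while {a, b, c} appears in only two of five transactions and is pruned.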
Efficient Data Mining for Path Traversal Patterns.
Set-oriented data mining in relational databases.
Data Knowl.
Showing 1931 to 1940 of 40694 items