Search results

Online since: May 2020
Authors: Yan Jiao Zhang, Yi Liu, Chen Wang, Gang Wen, Shi Wei Ren
Study Method and Data Definition of Product Category.
Table 1. Data inventory of ceramic tile production (columns: Item, Substance, Unit, Qty.).
(1) Designating background data. A 1% contribution to the total environmental impact is taken as the cut-off rule.
Processes included: (1) field emission; (2) use of electric power; (3) use of diesel oil; (4) use of soft coal; (5) use of natural gas; (6) water-sorted soil production; (7) feldspar sand production; (8) clay production; (9) quartz sand production; (10) road transport. Processes excluded: (1) diopside production; (2) use of feldspar; (3) talc production. (2) Determining the enterprise's on-site process data. Based on the background processes included above, the enterprise collects the corresponding on-site process data and raw-material consumption data.
Collection range of the enterprise’s on-site process data as finalized is shown in Table 4.
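The 1% cut-off rule described above can be sketched in a few lines. The process names follow Table 1's inventory, but the contribution shares are invented for illustration, not the paper's values:

```python
# Hypothetical contribution shares (fraction of total environmental impact)
# for each unit process; values are illustrative only.
contributions = {
    "Field emission": 0.18,
    "Use of electric power": 0.35,
    "Diopside production": 0.004,
    "Use of feldspar": 0.006,
    "Talc production": 0.003,
    "Road transport": 0.05,
}

CUTOFF = 0.01  # the 1% cut-off rule for selecting background data

# Processes at or above the cut-off are included; the rest are excluded.
included = {p for p, share in contributions.items() if share >= CUTOFF}
excluded = {p for p, share in contributions.items() if share < CUTOFF}
```

Applied to these illustrative shares, the rule keeps the energy and transport processes and drops the three minor mineral inputs, matching the included/excluded split above.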
Online since: February 2019
Authors: Young Hoon Moon, Gyeong Uk Jeong, Chung Gil Kang, Jun Park, Chul Kyu Jin
In addition, a rigid-plastic finite element simulation of the high-temperature compression test was performed using the flow stress data from the thermal deformation characteristics test.
As the temperature increased, the agreement between the experimental data and the finite element analysis results improved.
Table 2 summarizes the constants of Eq. (1), which closely reproduces the test results in each temperature zone for a given true strain rate.
Additionally, a rigid-plastic finite element analysis was performed with the material constants calculated from the high-temperature compression test results, and the following conclusions were obtained.
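As a hedged illustration of extracting material constants from compression-test data: the power-law form and all readings below are assumptions for the sketch, not the paper's Eq. (1) or its measurements. A log-linear least-squares fit of sigma = K * rate**m looks like:

```python
import math

# Hypothetical flow-stress readings at one temperature: (strain rate 1/s, stress MPa).
data = [(0.01, 42.0), (0.1, 55.0), (1.0, 72.0), (10.0, 95.0)]

# Fit sigma = K * rate**m by linear least squares in log-log space:
# log(sigma) = log(K) + m * log(rate)
xs = [math.log(r) for r, _ in data]
ys = [math.log(s) for _, s in data]
n = len(data)
xbar = sum(xs) / n
ybar = sum(ys) / n
m = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum((x - xbar) ** 2 for x in xs)
K = math.exp(ybar - m * xbar)  # m: strain-rate sensitivity, K: strength coefficient
```

The fitted constants can then be fed to the finite element solver as material data, which is the role the calculated constants play in the analysis above.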
Online since: November 2012
Authors: Xiang Hong Xue, Xiao Feng Xue, Lei Xu
Principal component analysis has two purposes: the first is data reduction, and the second is revealing the relations between variables.
Assume n samples, each with p characteristics (indices); the sample data set can then be expressed as the matrix X = (x_ij), i = 1, ..., n, j = 1, ..., p. (1) Steps of principal component analysis [6]. Step 1: data standardization. The original data are first standardized to eliminate the influence of dimension, giving the standardized data set.
The simulation data are divided into two parts: the data from 1990 to 2005 serve as the training sample set, and the data from 2006 to 2010 as the test sample set for the model. To eliminate the influence of dimensional differences among the SVM prediction indicator data on the prediction performance, the input data must be pre-processed before the SVM model is established, since this directly influences the training speed and the water demand prediction accuracy.
This article standardizes all the sample data with Eq. 8, normalizing all the data into the interval [0, 1]:
x* = (x_i - x_min) / (x_max - x_min), (8)
where x* is the normalized data, x_i is the index series data, and x_min and x_max are the minimum and maximum values of the original series, respectively.
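The Eq. 8 normalization described above is standard min-max scaling and can be sketched as (the sample series is invented):

```python
def min_max_normalize(series):
    """Normalize an index series into [0, 1] per Eq. 8: x* = (x - x_min) / (x_max - x_min)."""
    x_min, x_max = min(series), max(series)
    return [(x - x_min) / (x_max - x_min) for x in series]

# Hypothetical indicator series; the minimum maps to 0 and the maximum to 1.
scaled = min_max_normalize([12.0, 18.0, 15.0, 30.0])
```

After scaling, every SVM input indicator lies in the same [0, 1] range, which is exactly the dimensional-difference problem the paragraph above addresses.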
Online since: January 2013
Authors: Hong Kyu Kwon, Kwang Kyu Seo
The proposed framework incorporating the HLCCM operates in a cloud-computing-based collaborative design environment and allows users to estimate product data and other related information across a wide variety of applications.
The product attributes strongly correlated with LCC factors are chosen as learning data in the HLCCM.
The cost factors considered would depend on the stage in which we want to use the model, the kind of information to be extracted from the model, the data available as input to the model and the product being designed.
Table 3: The predicted LCC results with 200 sample data using the HLCCS (MAPE).

                           Conventional ANN         HLCCS
                           Training   Validation    Training   Validation
  Life cycle energy cost   19.32%     28.51%        11.13%     16.17%
  Maintenance cost          7.14%     11.17%         4.27%      7.02%
  Average                  13.23%     19.84%         7.70%     11.60%

In Table 3, the HLCCM has higher prediction accuracy than the BPNN for the validation data.
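The accuracy measure reported in Table 3 is the mean absolute percentage error (MAPE), which follows the usual definition; a minimal sketch with invented values:

```python
def mape(actual, predicted):
    """Mean absolute percentage error: mean of |actual - predicted| / |actual|, in percent."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical cost values, not the paper's 200-sample results.
error = mape([100.0, 200.0], [90.0, 220.0])
```

A lower MAPE means predictions closer to the measured life cycle costs, which is how the HLCCS and conventional ANN columns in Table 3 are compared.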
This distributed modeling paradigm is very different from centralized data repositories where users conform to a single standard and are required to check in on a regular basis to make sure their models are updated.
Online since: November 2015
Authors: Kalathingal Ashidh, A. Sumesh, A Santha Kumari, N Rajasekaran
The welding parameters, such as arc current and voltage, were captured using a data logger, and the stick-slip effect was analyzed.
The influence of stick-slip effect on welding is observed in the arc waveforms plotted from the data obtained from the data logger.
Welding parameters kept constant: wire feed speed 7.7 m/min; travel speed 0.7 m/min; arc length 2 mm; stick-out distance 10 mm; shielding gas 80% Ar + 20% CO2; gas flow rate 18 litres per minute.
TVC data logger. The Validation Center (TVC) data logger is a system able to record and analyse the welding parameters.
Although the TVC data logger can measure and analyse many parameters, such as temperature, gas flow rate and travel speed, the voltage, current and wire feed speed were sufficient for this analysis.
Arc waveforms After getting the transient data of arc voltage, current and wire feed speed, the arc waveforms are plotted.
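A minimal sketch of working with such transient logger data; the field layout and sample values below are invented, not actual TVC output:

```python
# Hypothetical logger samples: (time in s, arc voltage in V, arc current in A).
samples = [
    (0.000, 24.1, 180.0),
    (0.001, 23.8, 185.0),
    (0.002, 25.0, 176.0),
]

# Instantaneous arc power waveform is the point-wise product of voltage and current.
power_waveform = [(t, v * i) for t, v, i in samples]

# Simple summary statistics of the arc voltage waveform.
mean_voltage = sum(v for _, v, _ in samples) / len(samples)
```

Plotting voltage and current against time from such records gives the arc waveforms in which the stick-slip effect is observed.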
Online since: May 2014
Authors: Saharat Buddhawanna, Kamon Budsaba, Sayan Sirimontree, Krittiya Lertpocasombut, Boonsap Witchayangkoon
The survey data for each criterion are averaged to obtain the mean satisfaction score.
The questionnaire survey data are analyzed by SPSS, indicating a lack of understanding of the construction technique with prefabrication [1].
SPSS® (Statistical Package for the Social Science for Windows) is used to analyze the survey data.
Levene's test gives p-values <= 0.05, indicating that the survey data violate the homogeneity-of-variance assumption for most criteria.
When this assumption is violated, Welch and Brown-Forsythe tests are conducted to compare the mean satisfaction scores among habitat groups.
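Levene's test used above can be sketched from its textbook definition (mean-centred absolute deviations); this is a generic implementation for illustration, not the SPSS procedure, and the example groups are invented:

```python
def levene_statistic(*groups):
    """Levene's test statistic W, using mean-centred absolute deviations Z_ij = |Y_ij - mean_i|."""
    k = len(groups)
    N = sum(len(g) for g in groups)
    z = [[abs(y - sum(g) / len(g)) for y in g] for g in groups]
    zbar_i = [sum(zi) / len(zi) for zi in z]          # per-group mean of deviations
    zbar = sum(sum(zi) for zi in z) / N               # grand mean of deviations
    num = sum(len(g) * (zb - zbar) ** 2 for g, zb in zip(groups, zbar_i))
    den = sum((zij - zb) ** 2 for zi, zb in zip(z, zbar_i) for zij in zi)
    return (N - k) / (k - 1) * num / den
```

A large W (p <= 0.05 against the F distribution) signals unequal variances, which is the condition that triggers the Welch and Brown-Forsythe comparisons above.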
Online since: November 2019
Authors: Prasanna Vineeth Bharadwaj, P.S. Suvin, T.P. Jeevan, S.R. Jayaram
2. Data pre-processing
Missing data: The first step in data pre-processing is to take care of the missing data.
When the missing data are minimal, or when a row contains multiple missing values, the entire row can be removed from the dataset.
Encoding categorical data: The “operations” column under the machining conditions contains categorical data, i.e., the data consist of words instead of numerical values and therefore need to be encoded as numbers.
Principal Component Analysis (PCA): This is a data manipulation technique used to reduce the dimensionality of the original data [14].
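The first two pre-processing steps above can be sketched as follows; the column names and rows are hypothetical, not the paper's machining dataset (PCA, step 3, is usually delegated to a library):

```python
# Hypothetical machining records; None marks a missing value.
rows = [
    {"speed": 100.0, "feed": 0.20, "operation": "turning"},
    {"speed": None,  "feed": 0.30, "operation": "milling"},   # row with missing data
    {"speed": 120.0, "feed": 0.25, "operation": "turning"},
]

# Step 1 - missing data: drop any row containing a missing value (they are minimal here).
complete = [r for r in rows if all(v is not None for v in r.values())]

# Step 2 - encoding categorical data: map each "operations" word to an integer code.
codes = {op: i for i, op in enumerate(sorted({r["operation"] for r in complete}))}
for r in complete:
    r["operation"] = codes[r["operation"]]
```

After these steps every column is numeric and complete, which is the precondition for applying PCA to reduce dimensionality.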
Online since: August 2004
Authors: Tae Eun Jin, Cheol Kim, Heung Bae Park, Chang Sung Seok, Ill Seok Jeong
We performed the neural network training using the 97 data points measured in CASS materials by Aubrey and Chopra.
Fig. 2(b) shows the target value, the ferrite content, corresponding to the input data.
Therefore, we believe that the trained neural network described in this study can be used to effectively predict the ferrite content. Fig. 2. (a) Input data: chemical composition (wt%) of C, Mn, Si, Cr, Ni, Mo and N; (b) target data: ferrite content (vol.%).
Neural network architecture for prediction output and Aubrey equation results of Charpy impact energy. Fig. 7. Training data for prediction of Charpy impact energy: (a) input data: ferrite content (vol.%), aging temperature (°C), aging time (×10² hr); (b) target data: aged Charpy impact energy Cv (J/cm²). For Charpy impact energy prediction.
Online since: September 2015
Authors: Sergey A. Evdokimov, Vadim Rifhatovich Khramshin, Aleksey S. Evdokimov
They differ in ways of forming data flows and effective data rates within the communication channels.
The version of the recording system is selected mainly according to the performance data of the monitored items.
Network topology generally defines efficient allocation of data flows among computer-aided tools of the recording system.
This unit receives data flows from the industrial computers, organizes a database and, on request, forms the oscillograph-visualization data flows for the client computers [6].
Secondly, using the database as a link between the industrial and client computers significantly decreases the main effective rate of the data flows received by the clients.
Online since: July 2022
Authors: Mariana Conde, João Henriques, António Andrade-Campos, Sam Coppieters
It is very important to have adequate input data for the FEMU.
Generally, this approach compares experimental with numerical data.
Additionally, the iteratively generated numerical data are not compared directly with the reference material data.
Again, synthetically deformed images were created iteratively from the numerical analysis, so that the numerical data being compared are DIC-levelled numerical data.
The superscripts “num” and “exp” refer to the data generated iteratively during the optimization process and to the virtual experimental (reference) data, respectively.
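The comparison of the “num” and “exp” data in FEMU reduces to a least-squares cost evaluated at each optimization iteration; a minimal sketch with invented values (the real cost compares full-field DIC-levelled data, not three scalars):

```python
def femu_cost(data_num, data_exp):
    """Sum of squared differences between numerical ("num") and experimental ("exp") data."""
    return sum((n - e) ** 2 for n, e in zip(data_num, data_exp))

exp_strains = [0.010, 0.021, 0.034]   # virtual experimental (reference) data
num_strains = [0.012, 0.020, 0.030]   # one FE iteration's DIC-levelled output
cost = femu_cost(num_strains, exp_strains)
```

The optimizer updates the material parameters, reruns the FE model (and the synthetic image deformation), and repeats until this cost is minimized.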