Search results
Online since: November 2012
Authors: Li Xin Wang, Ming Yue Guo, Yu Guo
MBD (Model-Based Definition) uses a data set instead of engineering drawings, which brings a revolutionary change to the means of production.
The integrated digital model, also known as data set, mainly includes the geometry information and the non-geometry information.
Using the 3D model as the unique source of definition, we can eliminate the ambiguity between the 3D model, paper drawings, and downstream applications requiring data, which will greatly improve the quality and efficiency of engineering.
The crux of the idea was to extend the usage of digital models in the reduction or elimination of 2D drawings[4].
The ASME Y14.41 standard includes several parts: general, data set identification and control, data set requirements, design model requirements, common requirements for product definition data, notes and special notations, model values and dimensions, plus and minus tolerances, datum applications and geometric tolerances[5].
Online since: July 2015
Authors: Norzila Othman, S. Abdul-Talib, A. Yassin
Growth and substrate depletion data may be fitted to a simple Monod model, as stated in Equation 1 [1, 2].
The experimental data consisted of initial substrate concentration over a given sampling period.
The fourth-order Runge-Kutta method was used to fit the Monod equation to the experimental data.
The model was calibrated using five data sets; example data are shown in Table 2.
The lowest average of squared errors indicates the data set that best predicts the fitted values.
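The fitting procedure described above can be sketched as follows: integrate a simplified Monod substrate-depletion rate with classical fourth-order Runge-Kutta, then select the parameter pair with the lowest average of squared errors. The rate form (biomass lumped into μmax), the parameter grids, and all function names are illustrative assumptions, not the authors' code.

```python
import numpy as np

def monod_rate(s, mu_max, ks):
    # Simplified Monod substrate-depletion rate dS/dt (biomass lumped into mu_max)
    return -mu_max * s / (ks + s)

def rk4_integrate(s0, t, mu_max, ks):
    # Classic fourth-order Runge-Kutta over the sampling times t
    s = np.empty_like(t, dtype=float)
    s[0] = s0
    for i in range(1, len(t)):
        h = t[i] - t[i - 1]
        y = s[i - 1]
        k1 = monod_rate(y, mu_max, ks)
        k2 = monod_rate(y + 0.5 * h * k1, mu_max, ks)
        k3 = monod_rate(y + 0.5 * h * k2, mu_max, ks)
        k4 = monod_rate(y + h * k3, mu_max, ks)
        s[i] = y + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
    return s

def fit_monod(t, s_obs, mu_grid, ks_grid):
    # Pick (mu_max, Ks) minimizing the average of squared errors,
    # mirroring the "lowest average of error squares" criterion above.
    best = None
    for mu in mu_grid:
        for ks in ks_grid:
            err = np.mean((rk4_integrate(s_obs[0], t, mu, ks) - s_obs) ** 2)
            if best is None or err < best[0]:
                best = (err, mu, ks)
    return best
```

A grid search is used here only for transparency; a gradient-based least-squares solver would serve equally well.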
Online since: January 2012
Authors: Jia Li, Jia Xin
Because INSAR can image day and night and in all weather conditions, research on INSAR data processing has become very important in this area.
Meanwhile, the accuracy of this step is crucial to the reliability of subsequent image processing and to the final results of the data processing chain, which must meet sub-pixel precision requirements.
Flat-earth removal is an important step in INSAR data processing, and its accuracy affects the precision of the resulting DEM.
Interferometric SAR Phase Noise Reduction [J].
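As a minimal illustration of flat-earth removal, the sketch below assumes the flat-earth ramp dominates the interferogram's spectrum, estimates its fringe frequency from the 2-D FFT peak, and demodulates it. Production processors instead derive the ramp from orbit and baseline data and refine it; this is only a toy version.

```python
import numpy as np

def remove_flat_earth(interferogram):
    """Remove the flat-earth phase ramp from a complex interferogram.

    Sketch: the dominant spectral peak is taken as the flat-earth fringe
    frequency, and the interferogram is demodulated by the conjugate ramp.
    """
    rows, cols = interferogram.shape
    spectrum = np.fft.fft2(interferogram)
    # Index of the strongest spectral component = assumed flat-earth frequency
    r_peak, c_peak = np.unravel_index(np.argmax(np.abs(spectrum)), spectrum.shape)
    fr = np.fft.fftfreq(rows)[r_peak]
    fc = np.fft.fftfreq(cols)[c_peak]
    rr, cc = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    ramp = np.exp(-2j * np.pi * (fr * rr + fc * cc))
    return interferogram * ramp  # flattened interferogram
```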
Online since: January 2012
Authors: Qi Wang, Yong Xue Zhang, Shen Gao
This structured DFT methodology achieves an optimal compromise between test coverage and test costs such as chip area, pin count, test-vector data size, and test time.
DFT design flow. When running the memory test, the data, address, and control signals applied to the RAM are generated by the BIST; the BIST comparator then checks the data read from the RAM and generates pass/fail signals.
To increase fault detection, we use two data backgrounds in this algorithm: all 0s and a 01 interleave.
These scan chains are too long, which means the volume of scan test data would exceed the ATE's available data volume and the test time would be unacceptably long.
By using EDT, the test data volume and the test time of this chip are dramatically reduced, without decreasing the test coverage.
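The two-data-background idea can be sketched with a toy RAM model and a simplified March-style test. The fault model, the March element sequence, and all names here are hypothetical, chosen only to show why the all-0s and 01-interleave backgrounds are applied in turn.

```python
class FaultyRAM:
    """Toy RAM with an optional single stuck-at fault for demonstration."""
    def __init__(self, size, stuck_addr=None, stuck_val=0):
        self.mem = [0] * size
        self.stuck_addr, self.stuck_val = stuck_addr, stuck_val
    def write(self, addr, val):
        self.mem[addr] = val
    def read(self, addr):
        if addr == self.stuck_addr:  # stuck-at fault model
            return self.stuck_val
        return self.mem[addr]

def march_test(ram, size, background):
    """Simplified March: up(w bg); up(r bg, w ~bg); down(r ~bg)."""
    bg = [background(a) for a in range(size)]
    for a in range(size):
        ram.write(a, bg[a])
    for a in range(size):
        if ram.read(a) != bg[a]:
            return False
        ram.write(a, bg[a] ^ 1)
    for a in reversed(range(size)):
        if ram.read(a) != (bg[a] ^ 1):
            return False
    return True

def bist(ram, size):
    # Run the March test once per data background, as described above;
    # the second background improves coupling-fault coverage.
    all_zeros = lambda a: 0
    interleave_01 = lambda a: a & 1  # 0,1,0,1,... pattern
    return march_test(ram, size, all_zeros) and march_test(ram, size, interleave_01)
```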
Online since: May 2012
Authors: Yong Li, Ming Li Cao, Yan Li, Ming Fang
The data show the resource use and emissions output during the cement manufacturing process at the plant in 2009.
The detailed data are shown in Fig. 1.
With those data, the indicators shown in Table 1 can be calculated.
With the formulae shown in Table 1 and the data shown in Fig. 1, the main indicators can be calculated.
Comparable data are difficult to collect.
Online since: November 2012
Authors: Halil Husain, Muhd Khairulzaman Abdul Kadir, Mohamad Fiteri Razali, Shahliza Azreen Sarmin, Mohamad Zikri Zainol, Zulhilmy Sahwee
The data acquisition, analysis and display are included in the system, which targets different user categories [6].
To collect the voltage and current data, a microcontroller was used to re-route the system's power output to a dummy load.
Two important graphs generated from the collected data were current vs. voltage and power vs. load.
Field Test Data.
Based on the gathered data, the maximum output power generated was 1.5W.
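Finding the maximum output power from such a dummy-load sweep amounts to picking the sample with the largest V·I product. The sample values below are made up for illustration and are not the paper's field data.

```python
def max_power_point(samples):
    """samples: list of (voltage_V, current_A) pairs from the dummy-load sweep.
    Returns (power_W, voltage_V, current_A) at the maximum-power sample."""
    return max((v * i, v, i) for v, i in samples)

# Hypothetical sweep: power peaks where the V*I product is largest
samples = [(0.0, 0.50), (2.0, 0.48), (4.0, 0.36), (5.0, 0.30), (5.5, 0.10)]
p, v, i = max_power_point(samples)
```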
Online since: January 2013
Authors: Thanasis Triantafillou, Catherine Corina G. Papanicolaou
The results indicate that the use of prefabricated TRC stay-in-place formwork elements is a promising solution for achieving reduction of the construction time, minimization of labor cost and defect-free finishing of external surfaces.
The guaranteed tensile strength of the fibers (as well as of the textile, when the nominal thickness is used) in each direction was taken from data sheets of the producer equal to 3500 MPa and 1750 MPa for carbon and E-glass fibers, respectively; the elastic modulus of the fibers was 220 GPa (carbon) and 72 GPa (E-glass).
Concrete crushing in the compression zone for specimens with steel-reinforced cast-in-situ parts was the cause for load-bearing capacity reduction.
The specimen’s progressively decreasing stiffness was reflected in the reduced slope of each consecutive linear branch.
Buckling of the vertical steel bars was observed at a later stage of the test (at 50% load carrying capacity reduction) causing textile rupture and total failure of the specimens (Fig. 14f).
Online since: May 2016
Authors: Ho Sung Lee, Kyung Ju Min
CMH-17 provides standardized, statistically based material property data through a standardized methodology for developing, analyzing, and publishing property data for composite materials.
From this matrix it is possible to obtain statistically-based material properties that meet the most stringent handbook level of population sampling for B-values, data documentation and test method requirements.
In the data reduction method, the data from all environments, batches and panels can be utilized together to generate statistical information about the corresponding test.
This approach utilizes essentially small data sets to generate test condition statistics such as population variability and corresponding basis values to pool results for a specific failure mode across all environments.
Acknowledgement This study was supported by the Korean Ministry of Land, Infrastructure and Transport (MoLIT) through Air Transportation Advancement Program (ATAP) and the NCSRD (National Standard Reference Data Center) under Korean Agency for Technology and Standards.
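For a single normally distributed sample, a B-basis value (the value exceeded by 90% of the population with 95% confidence) is the mean minus a one-sided tolerance factor times the standard deviation. The sketch below uses the classical closed-form approximation for that factor; it is only an illustration of the concept, since CMH-17's pooled analysis additionally handles batches, environments, and non-normal distributions.

```python
from math import sqrt
from statistics import NormalDist, mean, stdev

def b_basis(data):
    """Approximate B-basis value under a normality assumption.

    Uses the standard closed-form approximation for the one-sided
    normal tolerance factor k (90% coverage, 95% confidence).
    """
    n = len(data)
    z_p = NormalDist().inv_cdf(0.90)  # population coverage percentile
    z_c = NormalDist().inv_cdf(0.95)  # confidence level
    a = 1 - z_c ** 2 / (2 * (n - 1))
    b = z_p ** 2 - z_c ** 2 / n
    k = (z_p + sqrt(z_p ** 2 - a * b)) / a  # one-sided tolerance factor
    return mean(data) - k * stdev(data)
```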
Online since: January 2013
Authors: Xiao Qiu Xu, Zhong Wen Wang, Rui Zhen Duan, Wan Li Xu, Fu Chun Tao
Hydraulic pump station and data collection as well as computer control system are shared by two subsystems.
Model verification based on MATLAB. The mathematical model of the electro-hydraulic servo-controlled single-channel system of the hydrostatic center-frame oil film has an explicit structure and order, but its parameters are time-variant and uncertain. Model identification is therefore based on servo-system tests: the measured input and output data are obtained, the necessary data processing and computation are applied, and the result is a mathematical model equivalent to the measured system [5-6].
The collected input and output data are imported using the MATLAB System Identification Toolbox; the data set is filtered, averaged, detrended, and otherwise pretreated to obtain usable input and output data [7-8].
Simulation analysis of the identified model is carried out off-line; experiments are then performed under different conditions, and model identification is carried out on the recorded data.
The off-line simulation output of the identified model fits the measured output of the identification experiment well; the two output curves agree closely.
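The pretreatment and identification steps can be sketched in Python (in place of the paper's MATLAB toolbox): moving-average filtering and linear detrending of the input/output records, followed by a least-squares fit of a first-order ARX model. The model order and all names here are illustrative assumptions, not the authors' identified model.

```python
import numpy as np

def detrend(x):
    # Remove a linear trend via a least-squares line fit
    t = np.arange(len(x))
    slope, intercept = np.polyfit(t, x, 1)
    return x - (slope * t + intercept)

def pretreat(u, y, window=5):
    """Pretreatment mirroring the text: moving-average filtering
    and trend removal of the logged input/output records."""
    kern = np.ones(window) / window
    u_f = np.convolve(u, kern, mode="same")
    y_f = np.convolve(y, kern, mode="same")
    return detrend(u_f), detrend(y_f)

def fit_arx1(u, y):
    """Least-squares fit of a first-order ARX model y[k] = a*y[k-1] + b*u[k-1]."""
    phi = np.column_stack([y[:-1], u[:-1]])
    a, b = np.linalg.lstsq(phi, y[1:], rcond=None)[0]
    return a, b
```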
Online since: August 2015
Authors: Titik Khawa Abdul Rahman, Zuhaila Mat Yasin, Zuhaina Zakaria
Hundreds of samples with different loading conditions were generated using the multiobjective QIEP technique developed in [3]; 70% of the generated data were assigned to the training process, and the remaining 30% were used for testing.
LS-SVM is reported to perform well when applied to data outside the training set.
If the selected value of sigma is too small, it leads to overfitting of the sample data.
On the contrary, if the value of sigma is too big, it leads to underfitting of the sample data [7].
Generate training and testing data. 2.
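The effect of sigma can be demonstrated with a minimal LS-SVM regression: with an RBF kernel, training reduces to solving one linear system in the dual variables and bias. This is a textbook sketch, not the authors' implementation, and the regularization parameter gamma is an assumed name.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma):
    # Gaussian (RBF) kernel; sigma is the width discussed above
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, sigma, gamma=10.0):
    """Solve the LS-SVM dual linear system for the bias b and alphas."""
    n = len(X)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate([[0.0], y])
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # b, alpha

def lssvm_predict(X_train, b, alpha, sigma, X_test):
    return rbf_kernel(X_test, X_train, sigma) @ alpha + b
```

On a smooth target, a moderate sigma fits the training set closely, while a very large sigma flattens the kernel and underfits, matching the behavior described above.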