Search results
Online since: January 2015
Authors: Wei Yang, Qing Chen, Jian Hua Chen, Cong Han, Shu Guang Wang
Data analysis
1) Natural ventilation quantity and radon exhalation rate
Radon exhalation rate is calculated according to Eq.4.
This conclusion is consistent with the analysis of the measured data presented above.
A total of 56 groups of valid data were obtained in the experiment.
Radon Measurement and Practical Data [M].
Estimating large-scale fractured rock properties from radon data collected in a ventilated tunnel[A].
Online since: March 2006
Authors: Lars Arnberg, Ragnvald H. Mathiesen
Dedicated processing software has
been developed to allow for quantitative extraction of data such as solid-liquid interface
morphology, local propagation velocities and constitutional gradients from the images.
The data collected also contain unprecedented in-situ observations on dendrite fragmentation.
Aggregate filtering, followed by linear contrast operations and noise reduction, gave a unique liquid data set that could be used for quantitative analysis, such as the contour maps in Fig.1.
From analysis of the full data sequence, it can be seen that due to liquid flow, substantially more solute piles up to the right than to the left, causing a large difference between the primary arms regarding their constitutional tip undercooling.
Such methods can be of importance in solidification science and provide data and observations on real systems and occurring phenomena.
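The filtering pipeline described above (aggregate filtering, linear contrast operations, noise reduction) might be sketched roughly as follows with NumPy; the specific filter choices here are generic stand-ins, not the authors' dedicated processing software.

```python
import numpy as np

def enhance_frame(frame, low_pct=2, high_pct=98):
    """Rough stand-in for the described pipeline on one radiograph frame."""
    f = frame.astype(float)
    # "Aggregate filtering": average each pixel with its 4 shifted neighbours.
    aggregated = (f
                  + np.roll(f, 1, axis=0) + np.roll(f, -1, axis=0)
                  + np.roll(f, 1, axis=1) + np.roll(f, -1, axis=1)) / 5.0
    # Linear contrast operation: stretch the chosen percentiles to [0, 1].
    lo, hi = np.percentile(aggregated, [low_pct, high_pct])
    stretched = np.clip((aggregated - lo) / (hi - lo), 0.0, 1.0)
    # Simple noise reduction: suppress values below a small floor.
    stretched[stretched < 0.02] = 0.0
    return stretched
```

The enhanced frames could then feed quantitative analysis such as contour maps.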
Online since: February 2018
Authors: Rahul Khanna, Imran Sayeed, Rajendra Kr. Dubey
Although the correlation was poor due to the large scatter in the data, it paved the way for future development of this technique.
These equations are best fit curves for the data sets for which they have been developed.
However, by including more data points, the existing relationship can be modified to suit local conditions.
A total of 94 data points, comprising 10 from metavolcanics, 72 from dolomites, and 12 from carbonaceous slates, were used for this study.
The maximum, minimum, average, and standard deviation of the final data set are shown in Table 7.
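The idea of refitting the best-fit relationship as more local data points become available can be sketched with NumPy; the data values and the polynomial degree below are purely illustrative, not the study's actual measurements.

```python
import numpy as np

# Hypothetical index-vs-strength pairs (illustrative values only).
x = np.array([20, 25, 30, 35, 40, 45], dtype=float)
y = np.array([12, 18, 26, 33, 41, 52], dtype=float)

# Best-fit curve for the data set it was developed on (2nd-order polynomial).
coeffs = np.polyfit(x, y, deg=2)

# Including more local data points simply means refitting the same form,
# yielding a relationship adapted to local conditions.
x_new = np.append(x, [50.0, 55.0])
y_new = np.append(y, [60.0, 71.0])
coeffs_local = np.polyfit(x_new, y_new, deg=2)
```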
Online since: March 2015
Authors: Yuan Zeng, Dong Xu Lu, Yan Li Liu, Kai Hou, Chao Qin, Jian Qiu
In this aspect, data mining technology [9] can play a vital role.
By identifying and eliminating possible information risks with the help of data mining and data processing technology, the work of data analysis can be made simple and intuitive.
2) Information transmission stage.
The system interacts with EMS advanced application to get required data for calculation.
Data between the two parts is transferred by unified data format, as is shown in Fig.3(b).
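The "unified data format" exchange between the system and the EMS advanced application could be sketched as follows; the JSON envelope and field names here are hypothetical, since the paper's actual format (Fig. 3(b)) is not reproduced in the text.

```python
import json

def pack_unified(records):
    """Serialize calculation inputs into a single agreed-upon format.
    The envelope layout is an illustrative assumption."""
    return json.dumps({"version": 1, "records": records}, sort_keys=True)

def unpack_unified(payload):
    """Parse the unified format back into records on the receiving side."""
    doc = json.loads(payload)
    if doc.get("version") != 1:
        raise ValueError("unsupported format version")
    return doc["records"]
```

Both sides agreeing on one envelope keeps the interface between the two parts stable even if either side's internal representation changes.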
Data mining technology for failure prognostic of avionics[J].
Online since: July 2012
Authors: Jun Yang, Ga Zhao, Hong Wei Ding, Ping Ping Shu
Interleaving is then applied so that burst errors are not concentrated in a contiguous block of data that cannot be corrected.
After that, the data must be serial-to-parallel converted to reduce the data-flow rate, and mapped into two parts, a real part and an imaginary part, to improve the spectral efficiency of the channel.
The whole module is made of the memory controller, the butterfly unit, the data pipeline module, adder and the external RAM memory.
The data of each subchannel can be expressed as D = (I + jQ) · KMOD.
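As a hedged illustration of this mapping, here is a minimal 16-QAM mapper; the Gray-coded levels and the value KMOD = 1/√10 follow common OFDM practice (e.g., IEEE 802.11a) and are assumptions, not details given in the text.

```python
import numpy as np

KMOD_16QAM = 1 / np.sqrt(10)  # normalizes average 16-QAM symbol power to 1

# Gray-coded mapping of two bits to one axis level (used for both I and Q).
LEVELS = {(0, 0): -3, (0, 1): -1, (1, 1): 1, (1, 0): 3}

def map_16qam(bits):
    """Map a bit stream (length divisible by 4) to complex 16-QAM symbols,
    D = (I + jQ) * KMOD."""
    groups = np.asarray(bits).reshape(-1, 4)
    symbols = []
    for b0, b1, b2, b3 in groups:
        i = LEVELS[(b0, b1)]  # real part from the first bit pair
        q = LEVELS[(b2, b3)]  # imaginary part from the second bit pair
        symbols.append((i + 1j * q) * KMOD_16QAM)
    return np.array(symbols)
```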
Peak-to-Average Power Ratio Reduction Techniques for OFDM Signals, IEEE Transactions, (2008), Vol.54(2), pp.257-268
Online since: June 2025
Authors: Ya Xin Sun, Qing Ye
Some researchers employed data-sampling techniques.
A novel data balancing and boosting technique was developed, aiming to improve prediction performance in the face of imbalanced data [36].
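The details of the cited balancing technique [36] are not given here; as a generic illustration of data-sampling for imbalanced classes, one common approach is random oversampling of the minority class:

```python
import random

def oversample_minority(samples, labels, seed=0):
    """Randomly duplicate minority-class samples until every class has as
    many samples as the largest class (a generic sketch, not [36])."""
    rng = random.Random(seed)
    by_class = {}
    for s, y in zip(samples, labels):
        by_class.setdefault(y, []).append(s)
    target = max(len(group) for group in by_class.values())
    out_samples, out_labels = [], []
    for y, group in by_class.items():
        # Pad each class with random re-draws from its own samples.
        padded = group + [rng.choice(group) for _ in range(target - len(group))]
        out_samples.extend(padded)
        out_labels.extend([y] * target)
    return out_samples, out_labels
```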
Nonlinear data fusion over entity-relation graphs for drug-target interaction prediction.
An ensemble‑based drug–target interaction prediction approach using multiple feature information with data balancing.
MDTips: a multimodal-data-based drug-target interaction prediction system fusing knowledge, gene expression profile, and structural data.
Online since: August 2010
Authors: Mao Liang Wu, Lin Jun Hua
The scanning paths, generated from the parts during the data pretreatment, are necessary
to the rapid prototyping fabrication.
SLC data can be generated from various sources, either by conversion from CAD models or more directly from systems that produce data arranged in layers, such as CT scanners.
2 Structure of the SLC file
The SLC file is divided into a header section, a 3D reserved section, a sample table section, and the contour data section.
The contour data section is a series of successive ascending Z cross-sections or layers with the accompanying contour data.
Each contour contains the minimum Z layer value and the number of boundaries, followed by the list of individual boundary data.
The boundary data contains the number of (x, y) vertices for that boundary, the number of gaps, and finally the list of floating-point vertex points.
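A parser for the contour layout described above might look like the following sketch; the exact byte packing (little-endian 4-byte floats and unsigned ints) is an assumption, as the text does not specify it.

```python
import struct

def read_contour_layer(buf, offset=0):
    """Parse one contour layer: Z value, boundary count, then per boundary
    the vertex count, gap count, and (x, y) float vertices."""
    z, n_boundaries = struct.unpack_from("<fI", buf, offset)
    offset += 8
    boundaries = []
    for _ in range(n_boundaries):
        n_vertices, n_gaps = struct.unpack_from("<II", buf, offset)
        offset += 8
        pts = struct.unpack_from("<%df" % (2 * n_vertices), buf, offset)
        offset += 8 * n_vertices
        boundaries.append({"gaps": n_gaps,
                           "vertices": list(zip(pts[0::2], pts[1::2]))})
    return {"z": z, "boundaries": boundaries}, offset
```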
Online since: September 2015
Authors: Alexander Rodygin, Valentina Lyubchenko, Svetlana Rodygina
The rules of retail electricity markets oblige consumers to plan power consumption accurately, which requires forecasting methods of the desired accuracy even when the available data are inadequate (i.e., data with deficiencies and abnormal deviations).
Finding relationships in a large volume of data requires considerable time and unconventional algorithms.
To make the final model more reliable (if the data set allows), one more set is reserved – the testing set of observations.
Calculations of short-term load forecasting were carried out for a subject with variation of initial data.
The last column of Table 1 shows the model error determined on the control data subset, calculated over all control data sets.
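The three-way partition described above (training, validation, and a reserved testing/control set) can be sketched as follows; the split ratios and the percentage-error metric are illustrative assumptions, not the paper's exact procedure.

```python
import random

def split_observations(observations, seed=0, train=0.6, valid=0.2):
    """Shuffle and partition observations into training, validation, and a
    reserved testing (control) set; ratios are illustrative."""
    rng = random.Random(seed)
    data = list(observations)
    rng.shuffle(data)
    n_train = int(len(data) * train)
    n_valid = int(len(data) * valid)
    return (data[:n_train],
            data[n_train:n_train + n_valid],
            data[n_train + n_valid:])

def mean_abs_pct_error(actual, predicted):
    """Model error on the control subset as a mean absolute percentage error."""
    return 100.0 * sum(abs(a - p) / a
                       for a, p in zip(actual, predicted)) / len(actual)
```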
Online since: June 2010
Authors: E Xu, Liang Shan Shao, Fang Yang, Tao Qu, Xin Cai Gu
Rough set theory is an efficient mathematical tool for dealing with uncertain and incomplete data; it requires no prior knowledge or additional information beyond the data itself [7-10].
It can recover missing data and reduce redundant data.
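As a simplified illustration of these two ideas (not Pawlak's full reduct machinery), missing attribute values can be filled in from rows that are indiscernible on all known attributes, and exact-duplicate rows dropped as redundant:

```python
def recover_and_reduce(rows):
    """Fill missing values (None) from rows agreeing on every known
    attribute, then drop exact-duplicate (redundant) rows."""
    filled = []
    for row in rows:
        row = list(row)
        for j, v in enumerate(row):
            if v is None:
                # Candidate donors: rows indiscernible from this one on
                # every attribute that is known in both.
                for other in rows:
                    if other[j] is not None and all(
                            a is None or b is None or a == b
                            for a, b in zip(row, other)):
                        row[j] = other[j]
                        break
        filled.append(tuple(row))
    # Redundancy reduction: keep a single copy of each identical row.
    seen, reduced = set(), []
    for row in filled:
        if row not in seen:
            seen.add(row)
            reduced.append(row)
    return reduced
```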
Rough Sets-Theoretical Aspects of Reasoning about Data.
Pawlak, Rough Set Theory and Its Application to Data Analysis.
A New Method of Packing the Missing Data.
Online since: December 2024
Authors: Muhammad Hafiz Hassan, Jamaluddin Abdullah, Choo Then Xiang
These findings enhance aircraft production by promoting better hole quality through optimal drill bits, leading to material savings, lower rejection rates, and cost reduction, thereby boosting reliability and productivity.
Investigation of Damage Reduction When Dry-Drilling Aramid Fiber-Reinforced Plastics Based on a Three-Point Step Drill.
International Journal of Data and Network Science, 4(1), 43-56.