Search results

Online since: September 2013
Authors: Ling Yu Zhang, Huai Lin Luo, Gang Xu, Yang Zhou
The three key technologies of reverse engineering are data acquisition, data processing, and model reconstruction.
A 3D optical scanner can be used for data collection: the product is scanned from different angles and orientations to obtain multi-view 3D point cloud data of the bottle. The multi-view point cloud data of the plaster model is shown in Figure 4.
3.4 Data Processing
Pre-processing the data facilitates the subsequent model reconstruction. First, the point cloud data captured from different angles are spliced together, using the surface texture or feature points of the measured object, in three-dimensional data-processing software to obtain the complete 3D point cloud of the product; the data can then be exported in various formats such as ASC, OBJ, WRL, STL, TXT, and IGES.
Finally, the point cloud data are imported into reverse-engineering software such as Imageware for further processing, which includes point cloud filtering, data reduction and sampling, segmentation and merging, optimization, and smoothing.
The processed point cloud data of the bottle body is shown in Figure 5.
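The data reduction and sampling step can be sketched as voxel-grid downsampling, one common point cloud reduction technique. This is an illustrative stand-in, not the Imageware operations themselves; the function name, voxel size, and synthetic cloud are our assumptions:

```python
import numpy as np

def voxel_downsample(points, voxel=1.0):
    """Reduce a point cloud by averaging all points that fall in the
    same cubic voxel (a common data-reduction step before surface
    reconstruction). `points` is an (N, 3) array."""
    keys = np.floor(points / voxel).astype(np.int64)
    # Group points by voxel index.
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    n_voxels = int(inverse.max()) + 1
    sums = np.zeros((n_voxels, 3))
    counts = np.zeros(n_voxels)
    np.add.at(sums, inverse, points)   # unbuffered accumulation per voxel
    np.add.at(counts, inverse, 1)
    return sums / counts[:, None]      # centroid of each occupied voxel

cloud = np.random.rand(10000, 3) * 100.0   # synthetic scan, mm
reduced = voxel_downsample(cloud, voxel=5.0)
print(len(cloud), "->", len(reduced))
```

A larger voxel gives a coarser but lighter cloud; the centroid averaging also provides a mild smoothing effect.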
Online since: May 2020
Authors: L.B. Leontiev, I.G. Khal'chenko, V.N. Makarov, Nikolai Shapkin
The minimum sizes of the ordered (matrix) and disordered (void) regions were calculated from the obtained PAS data.
Based on the PAS data, in particular the positron (τ2) and positronium (τ3) lifetimes, the corresponding annihilation intensities (I2, I3), and the annihilation rate constants (k2, k3), the authors of [5] found the disordered and ordered volumes per 1 cm³ and the number of annihilations in 1 cm³, which allowed the positron and positronium 'trap' volumes to be calculated according to formula (3) for Vspec (Å³), where NPs is the number of positronium Ps annihilations in 1 cm³ and Ne is the number of positron e+ annihilations in 1 cm³.
Results and Discussion
PMPS structures (Fig. 1) were determined based on elemental analysis and spectral data: they were similar to the structures of layered phyllosilicates [6].
Analysis of the Table 2 data suggested that SPPV determined the intensity of the electrostatic interaction between the spherical particles, while SPPM determined that of the electrostatic interaction between the chains within a polymer particle.
This can be explained by a reduction in the volume of the voids in which the lubricant can be contained; under friction at boundary lubrication this prevented seizure between the conjugated surfaces (i.e., it reduced the adhesive component of the friction force).
Online since: December 2014
Authors: Shu Wang Yan, Zhi Fa Yu, Sheng Li Zhu
Through organizing and analyzing the data of several ultrahigh-energy dynamic compaction projects, it is argued that the vibration-acceleration judgment method is flawed.
Analysis
Based on the data in Tables 1-4, the vibration velocity-distance and vibration acceleration-distance attenuation curves are drawn; see Figures 1 and 2.
Affected Distance of Ultrahigh Energy Dynamic Compaction
To investigate the law governing the affected distance of high-energy dynamic compaction, the authors collected the data of several high-energy dynamic compaction projects and calculated the affected safety distance with both methods.
No.  Project                                        Geological Condition      Tamping Energy (kN·m)  Safety Distance (m)  Judgment Method
1    Some engineering in Qingyang City, Gansu [4]   Loess                     15000                  55                   Velocity
2    Some coastal engineering [5]                   Backfill sand gravel      10000                  58.9                 Acceleration
3    Some Qingdao engineering [6]                   /                         8000                   45                   Velocity
4    Some engineering in Qingyang City, Gansu [7]   Silty clay                8000                   42                   Velocity
                                                                              12000                  34                   Velocity
                                                                              15000                  49                   Velocity
5    Some dynamic compaction engineering [8]        Sand gravel               10000                  25                   Acceleration
                                                                              16000                  27.7                 Acceleration
6    Some engineering in Caofeidian, Tangshan [9]   Backfill sand, fine sand  8000                   118                  Acceleration
                                                                              12000                  83.4                 Acceleration
7    Some oil tank engineering in Jinzhou           Backfill aggregate        12000                  39.1                 Velocity
                                                                              12000                  48.2                 Acceleration
                                                                              15000                  59.2                 Velocity
                                                                              15000                  51.0                 Acceleration
Based on the data of Table 8, Figure 3 is drawn.
(a) Vibration velocity judgment method; (b) vibration acceleration judgment method
Fig. 3 Tamping energy-distance curves obtained from the two calculation methods
From Figure 3 we can see that no clear regularity is present, owing to the limited quantity of data collected and the influence of geological conditions and other factors on the transmitting seismic wave.
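Attenuation curves of this kind are commonly fitted as a power law of distance. A minimal sketch follows; the data points, the 1.0 cm/s threshold, and the power-law form are illustrative assumptions, not the paper's tables or criterion:

```python
import numpy as np

# Illustrative (distance m, peak vibration velocity cm/s) pairs;
# synthetic values, not the measurements in Tables 1-4.
r = np.array([10.0, 20.0, 40.0, 80.0])
v = np.array([8.0, 3.1, 1.2, 0.45])

# Power-law attenuation v = k * r**(-beta) is linear in log-log space,
# so an ordinary least-squares line fit recovers beta and k.
slope, log_k = np.polyfit(np.log(r), np.log(v), 1)
beta = -slope              # slope is -beta for a decaying law
k = np.exp(log_k)

def predicted_v(distance):
    return k * distance ** (-beta)

# A "safety distance" is then where the predicted velocity drops
# below an allowable threshold (illustrative criterion).
threshold = 1.0  # cm/s
safe_r = (k / threshold) ** (1.0 / beta)
print(f"beta = {beta:.2f}, safety distance = {safe_r:.1f} m")
```

With real project data, a separate fit per site (or per judgment method) would be needed, which is exactly why scatter across geological conditions blurs any single regularity.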
Online since: October 2014
Authors: Robert Andrzej Lis
DTs and WAMS Principles
The DT technique is an effective supervised data-mining tool for solving classification problems in large databases.
This approach served the industry well, but it lacked the ability to observe measurements across the whole system because the data were not time-synchronized.
Entropy is a measure that makes attributes comparable, so that the attributes which split the data most purely can be placed higher up the tree.
This probability measure indicates how uncertain we are about the data.
The usual approach to constructing decision trees (Fig. 2) relies on greedy heuristics such as entropy reduction (7), which can over-fit the training data and lead to poor accuracy in future predictions.
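The entropy-based ranking of split attributes can be sketched as follows; the toy attributes, labels, and function names are ours, not the paper's WAMS data:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class-label list: 0 for a pure node,
    1 bit for a 50/50 two-class node."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction achieved by splitting the data on one attribute:
    parent entropy minus the size-weighted entropy of the child nodes."""
    base = entropy(labels)
    n = len(labels)
    split = {}
    for row, lab in zip(rows, labels):
        split.setdefault(row[attr], []).append(lab)
    return base - sum(len(part) / n * entropy(part) for part in split.values())

# Toy data: 'stable'/'unstable' system states vs. two discrete attributes.
rows = [{"load": "high", "gen": "on"}, {"load": "high", "gen": "off"},
        {"load": "low", "gen": "on"}, {"load": "low", "gen": "off"}]
labels = ["unstable", "unstable", "stable", "stable"]
print(information_gain(rows, labels, "load"))  # splits purely -> gain 1.0
print(information_gain(rows, labels, "gen"))   # uninformative -> gain 0.0
```

A greedy tree builder simply picks the attribute with the largest gain at each node, which is the over-fitting-prone heuristic the text refers to.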
Online since: July 2011
Authors: Phani Srikanth, Amarjot Singh, Devinder Kumar, Aditya Nagrare, Vivek Angoth
This trend has motivated developments in machine intelligence, especially in the field of medical data analysis.
Medical data analysis has been applied to a number of tasks, such as predicting the state of different diseases, building open-source software for medical data analysis, and supporting pharmaceutical clinical trials.
SVM and Boosting have been the favourite methodologies for such data analysis.
The training data consist of objects categorized into classes and are used to construct the SVM, while the testing data are then used to classify unknown inputs.
Here, we make an assumption that the training data is distributed uniformly over.
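The train/classify split described above can be illustrated with a minimal linear SVM trained by stochastic sub-gradient descent on the hinge loss (a Pegasos-style sketch; the toy data, solver choice, and names are ours, not the paper's):

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Minimal linear SVM trained by stochastic sub-gradient descent on
    the regularized hinge loss (Pegasos-style). y must be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    b = 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1
            eta = 1.0 / (lam * t)            # decaying step size
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:                   # inside margin: hinge is active
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:                            # only the regularizer acts
                w = (1 - eta * lam) * w
    return w, b

def classify(X, w, b):
    return np.where(X @ w + b >= 0, 1, -1)

# Toy training data: objects already categorized into two classes.
X_train = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y_train = np.array([1, 1, -1, -1])
w, b = train_linear_svm(X_train, y_train)

# Testing data: unknown inputs classified by the trained model.
X_test = np.array([[2.5, 2.0], [-2.5, -2.0]])
print(classify(X_test, w, b))
```

For non-separable or high-dimensional medical data one would use a kernelized or soft-margin library implementation instead; this sketch only shows the train-then-classify workflow the text describes.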
Online since: November 2010
Authors: Xin Rong Liu, Guang Yang
Knowledge management in China's coal enterprises is still at an initial stage: a complete structure for sharing enterprise data resources and a centralized data-management function have not yet been established.
Data sources are the basis of the knowledge management system, including the production management information within the enterprise as well as external data and offline data.
Internal production management information is stored in corporate operations, including database data from a variety of business and office-automation systems as well as various types of document data.
Take mine safety data sources as an example: gas, pressure, roof conditions, coal dust, water, fire and other natural disasters are the factors affecting coal mine production safety that are both the most important and the most difficult to control [5]. Taking these major disaster factors as the entry point and analysing their impact on the various knowledge data sources provides a data guarantee for establishing the mine safety knowledge management system.
Examples include records of safety incidents together with their analysis and statistics; underground power-supply monitoring system data; safety inspector data; and mine environmental monitoring system data.
Online since: September 2012
Authors: Fu Cheng You, Dong Sheng Zhang
Reduction of capital investment: SaaS is a model based on Internet services.
The edited contents and other data can then be stored into the database via JDO technology.
Data Persistence module: the Data Persistence module uses Java Data Objects (JDO) technology.
Through JDO, this module can store data or query it via GQL (Google Query Language).
The data may include document information, document contents, user information, workflow control data or other related data.
Online since: July 2013
Authors: Da Zhang, Le Wen Yu, Yuan Sheng Zhang
The sub-pixel center of the light stripe is found with high precision and high robustness.
2 The principle of measurement
The structure of the laser triangulation method is simple, its response is fast, and it readily supports online data processing, so it is used more and more widely in industrial measurement [6-8].
Fig. 1 The principle of laser triangulation measurement (Eq. 1), where a is the distance from the intersection of the optical axis of the laser beam and the optical axis of the receiving lens to the receiving lens, b is the distance between the receiving lens and the imaging plane, θ is the angle between the two optical axes, and Y is the displacement on the CCD corresponding to the height deviation y.
3 The system components
This paper presents a three-dimensional measurement system based on a line laser: the line laser is projected along the measured surface of the object, a CCD camera collects the data of the laser lines, and the depth information is calculated to reconstruct the three-dimensional shape.
Fig. 2 The diagram of the measurement system
4 The algorithm for extracting the light stripe center
It is very important to accurately acquire the light stripe center in measurement systems; common methods include the gray threshold method, the extremum method, the digital reduction shadow method, and so on.
First, a calibration block of size 100 mm × 40 mm × 10 mm is placed on a moving datum plane.
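Among sub-pixel stripe-center techniques, the gray-weighted centroid (center-of-gravity) method is a common baseline in the same family as the methods listed above. A hedged sketch with a synthetic stripe image follows; the threshold, image sizes, and function name are illustrative:

```python
import numpy as np

def stripe_centers(image, threshold=30):
    """Sub-pixel light-stripe center for each image column using the
    gray-weighted centroid method: the center is the intensity-weighted
    mean row of the pixels above a noise threshold."""
    centers = np.full(image.shape[1], np.nan)
    rows = np.arange(image.shape[0], dtype=float)
    for col in range(image.shape[1]):
        w = image[:, col].astype(float)
        w[w < threshold] = 0.0          # suppress background noise
        if w.sum() > 0:
            centers[col] = (rows * w).sum() / w.sum()
    return centers

# Synthetic stripe: a Gaussian intensity profile centered at row 20.3.
h, wd = 64, 8
row_idx = np.arange(h, dtype=float)[:, None]
img = (255 * np.exp(-((row_idx - 20.3) ** 2) / (2 * 1.5 ** 2))) * np.ones((1, wd))
print(stripe_centers(img)[:3])   # sub-pixel centers, close to 20.3
```

On real images the threshold trades robustness against bias, which is why the literature also uses extremum and template-based refinements.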
Online since: August 2011
Authors: Jin Hua Ju, Hua Wang, Ji Wen Xu
The electrical properties of the 0.01 mol% V2O5-doped ZnO varistor ceramics were measured with a DC parameter instrument; from the test data, the nonlinear coefficient is 25, the leakage current density is 0.02 µA/mm², and the voltage gradient is 31.1 V/mm.
The electrical property data are summarized in Table 1.
The data of the curve in Fig. 3 are consistent with the electrical property data in Table 1.
Moreover, a reduction of V5+ to V4+ would occur at high temperatures, and the V4+ ion has an ionic radius of ~0.61 Å, which is close to that of Zn2+.
On the basis of the grain-boundary defect model, an analysis procedure using the E-J experimental data was satisfactorily applied to calculate the grain-boundary parameters.
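As a hedged illustration of how a nonlinear coefficient of this order follows from the E-J curve, alpha can be estimated from two points by assuming J ∝ E^alpha between them; the sample points below are invented, not the measured data:

```python
import math

def nonlinear_coefficient(e1, j1, e2, j2):
    """Varistor nonlinear coefficient alpha from two points (E1, J1),
    (E2, J2) on the E-J curve, assuming J is proportional to E**alpha
    between the two points."""
    return math.log(j2 / j1) / math.log(e2 / e1)

# Illustrative points (V/mm, mA/mm^2); not the measured data above.
alpha = nonlinear_coefficient(28.0, 0.1, 31.1, 1.0)
print(round(alpha, 1))  # about 21.9
```

The steeper the E-J curve between the two reference current densities, the larger alpha, which is why a small change in voltage gradient accompanies a decade of current for a good varistor.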
Online since: June 2010
Authors: Jing Tao Han, Jing Liu, Guo Liang Xie, Pei Jie Yan
Isothermal compression tests of the two component materials were carried out on a Gleeble 1500 in order to obtain their high-temperature strength data, as shown in Fig. 2.
The temperature and pressure data of 100 billets were then processed by Newton interpolation, as shown in Fig. 3.
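Newton interpolation as applied to such data can be sketched with divided differences; the temperature/pressure pairs below are invented placeholders, not the 100-billet data set:

```python
def newton_coefficients(xs, ys):
    """Divided-difference coefficients of Newton's interpolating
    polynomial through the points (xs[i], ys[i])."""
    coef = list(ys)
    n = len(xs)
    for j in range(1, n):
        # Update in place from the bottom up so lower-order
        # differences are still available when needed.
        for i in range(n - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])
    return coef

def newton_eval(xs, coef, x):
    """Evaluate the Newton-form polynomial at x (Horner-like scheme)."""
    result = coef[-1]
    for i in range(len(coef) - 2, -1, -1):
        result = result * (x - xs[i]) + coef[i]
    return result

# Illustrative (temperature C, pressure MPa) pairs; made-up values.
xs = [900.0, 1000.0, 1100.0, 1200.0]
ys = [620.0, 540.0, 480.0, 440.0]
coef = newton_coefficients(xs, ys)
print(newton_eval(xs, coef, 1050.0))  # interpolated pressure: 507.5
```

Adding a new data point only appends one coefficient, which makes the Newton form convenient when billet measurements arrive incrementally.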
By combining the data in Tables 2 and 3 with Equation (9), the extrusion pressure can be estimated approximately.
Thus, the calculation based on this assumption shows good agreement with the data obtained from industrial trials, with an error of less than 10%.