Search results
Online since: July 2007
Authors: Eberhard Kunke, Hans Bräunlich, Reimund Neugebauer, Angela Göschel
Batch Size Reduction
Germany continues to set its focus on cutting-edge technology and premium quality.
The consequence of this is a trend towards a reduction in batch sizes.
[Figure 4 - Reduction potentials for die costs: the process chain (component concept, process chain selection, process development, die design, die material, die manufacturing, die assembly, try-out) is optimized depending on geometry, material and quantities, through operational optimization (data consistency, planning, feedback, communication, reverse engineering) and technological optimization (die design, die material, contour generation).]
However, if $n'_{\min} > n_{\min}$ (2), then major reductions in tooling costs can still be achieved by reducing the degree of shape memory.
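The underlying cost relation is not reproduced in this snippet; a minimal sketch of the usual break-even reasoning, assuming (not from the paper) a fixed die cost $C_{\text{die}}$ amortized over the batch and an alternative die-less process with a higher per-part cost $c_{\text{alt}}$:

```latex
% Assumption for illustration: n_min is the batch size at which the
% amortized die cost equals the per-part cost advantage over a die-less
% alternative process.
\[
  n_{\min} \;=\; \frac{C_{\text{die}}}{c_{\text{alt}} - c_{\text{die}}}
\]
% Any measure from Figure 4 that lowers C_die therefore lowers n_min,
% keeping die-based production economical at smaller batch sizes.
```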
Online since: August 2013
Authors: Guang Tian, Xin Jie Shao, Jin Hua Liu, Yun Guang Qi
Introduction
Fault diagnosis by pattern recognition [1,2] classifies using measured data and needs no systematic mathematical model, so it is called data-based fault diagnosis.
In 1992, the French scholars Dubois and Prade [3] proposed fuzzy-rough sets, which combine the advantages of fuzzy set and rough set theory: fuzzy sets replace the precise (crisp) sets, yielding a reduction with less information loss and preserving the classification capability of the original data sets as far as possible.
Dynamic Clustering and Discretization
The essence of discretizing continuous data is clustering. Many discretization methods for continuous data exist, but they do not consider the resolution of the elements within a set.
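As a minimal sketch only (the paper's dynamic clustering procedure is not reproduced in this snippet), discretization by clustering can be illustrated with a one-dimensional k-means, which maps each continuous attribute value to the index of its cluster:

```python
import numpy as np

def kmeans_discretize(values, k=3, iters=100, seed=0):
    """Discretize a 1-D continuous attribute by clustering (k-means sketch).

    Each value is replaced by the index of the cluster centre it falls in,
    so the discrete codes follow the data's own structure rather than
    fixed-width bins.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(values, dtype=float)
    centres = rng.choice(x, size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centres[None, :]), axis=1)
        new = np.array([x[labels == j].mean() if np.any(labels == j) else centres[j]
                        for j in range(k)])
        if np.allclose(new, centres):
            break
        centres = new
    # Re-index clusters so that code 0 is the lowest centre, and so on.
    order = np.argsort(centres)
    remap = np.empty(k, dtype=int)
    remap[order] = np.arange(k)
    return remap[labels]

# Hypothetical example: vibration amplitudes from several working
# conditions mapped to discrete codes before attribute reduction.
codes = kmeans_discretize([0.11, 0.12, 0.55, 0.57, 0.98, 1.02], k=3)
```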
Thus the condition attribute reduction is obtained; the result is the same as that of the reduction method for common rough sets.
Using the above rules, we can perform pattern recognition on the measured data for four working conditions. For comparison, the common rough set method was applied to the same data; as shown in Table 5, the identification precision of the fuzzy rough set method is much better than that of the common rough set method.
Online since: June 2014
Authors: Zhen Fei Song, Ming Xie
A projection-based model order reduction method is then utilized to obtain a compact macro-model.
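The snippet does not give the projection basis used; as a minimal sketch of the generic technique, assuming a plain Galerkin projection with a POD-style basis taken from state snapshots, a reduced macro-model can be formed as follows:

```python
import numpy as np

def galerkin_reduce(A, B, C, V):
    """Project a state-space model (A, B, C) onto an orthonormal basis V.

    Approximating x = V z gives the reduced model:
        z' = (V^T A V) z + (V^T B) u,   y = (C V) z.
    V can come from, e.g., the dominant left singular vectors of state
    snapshots (POD) or a Krylov subspace.
    """
    Ar = V.T @ A @ V
    Br = V.T @ B
    Cr = C @ V
    return Ar, Br, Cr

# Hypothetical example: reduce a random 100-state model to 10 states
# using the dominant singular vectors of placeholder state snapshots.
rng = np.random.default_rng(1)
n, r = 100, 10
A = -np.eye(n) + 0.01 * rng.standard_normal((n, n))
B = rng.standard_normal((n, 1))
C = rng.standard_normal((1, n))
snapshots = rng.standard_normal((n, 200))            # placeholder snapshots
U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
Ar, Br, Cr = galerkin_reduce(A, B, C, U[:, :r])
```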
The scheme and computational methodology of a developed calculation tool are introduced, together with some practical calibration data which indicates the effectiveness of the proposed methods.
The content is organized as follows: first, an overview of the ECSM is given, with attention to the method's disadvantages; then the scheme and computational methodology of the developed calculation tool are presented in detail, together with practical antenna calibration data that indicate the effectiveness of the proposed methods; finally, conclusions are drawn.
Model Order Reduction.
Fig. 7(a) shows the monopole model after discretization, while Fig. 7(b) gives the calculation results together with data from the analytical formula (1).
Online since: May 2014
Authors: Xiang Ming Wen, Zhi Qun Hu, Zhen Hai Zhang, Yang Chun Li, Zhao Ming Lu, Xiao Kui Chen
For operators, voice applications and their own data applications are the main sources of revenue.
The asymmetry between signaling traffic and data traffic is the distinctive feature of always-on applications. As the traditional mobile communication service, voice applications must be guaranteed at all times.
The other applications take up a lot of data traffic and the corresponding signaling resources.
The eNodeB obtains the resource usage of each type of application over a period of time, and then sends the data to the strategy controller.
The QoS negotiation consists of the QoS reduction of the new application and the QoS reduction of the existing applications which belong to Type I and Type III.
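A minimal sketch of such a two-stage negotiation, assuming hypothetical rate-based QoS parameters and thresholds (none of these names come from the paper):

```python
# Sketch: when a new application cannot be admitted, first lower the new
# application's own requested rate, then degrade existing Type I / Type III
# applications down to a floor until the request fits the capacity.
def negotiate_qos(new_app, existing, capacity, step=0.1, floor=0.5):
    used = sum(a["rate"] for a in existing)

    # Stage 1: QoS reduction of the new application itself.
    while new_app["rate"] > new_app["min_rate"] and used + new_app["rate"] > capacity:
        new_app["rate"] = max(new_app["min_rate"], new_app["rate"] * (1 - step))

    # Stage 2: QoS reduction of existing Type I / Type III applications.
    for app in (a for a in existing if a["type"] in ("I", "III")):
        if used + new_app["rate"] <= capacity:
            break
        reducible = app["rate"] * (1 - floor)          # keep at least the floor
        cut = min(reducible, used + new_app["rate"] - capacity)
        app["rate"] -= cut
        used -= cut

    return used + new_app["rate"] <= capacity          # True if admitted

apps = [{"type": "I", "rate": 4.0}, {"type": "II", "rate": 3.0}]
admitted = negotiate_qos({"rate": 5.0, "min_rate": 2.0}, apps, capacity=10.0)
```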
Online since: March 2011
Authors: Li Di Wang, Jiang Feng Tang, Jun Sheng Shi
Furthermore, experimental results based on the measured data show that the wavelet denoising method is more efficient in aspects such as identification accuracy and SSE.
In load modeling, because the natural load changes at any time and the measurement instruments introduce errors, as do data transformation and compression, the measured data inevitably include disturbance and noise.
Preprocess the original measured data, including the voltage signal V(t) and the active power signal P(t).
After preprocessing the signal, take the difference of the voltage data to locate the step point of the voltage variation, and then select a suitable threshold value T.
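A minimal sketch of this preprocessing, assuming PyWavelets and the standard universal threshold for T (the paper's exact threshold rule is not given in this snippet):

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Soft-threshold wavelet denoising (generic sketch, not the paper's
    exact scheme). The threshold T is the universal threshold estimated
    from the finest-scale detail coefficients."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise estimate
    T = sigma * np.sqrt(2 * np.log(len(signal)))         # threshold value T
    coeffs[1:] = [pywt.threshold(c, T, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# Locate the voltage step point from the differenced, denoised signal.
t = np.linspace(0, 1, 1024)
v = np.where(t < 0.5, 1.0, 0.9) + 0.01 * np.random.randn(t.size)
v_clean = wavelet_denoise(v)
step_idx = int(np.argmax(np.abs(np.diff(v_clean))))      # step point index
```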
A noise reduction technique for on-line detection and location of partial discharges in high voltage cable networks.
Online since: June 2012
Authors: Yu Sun, Zhong Hui Lin, Ru Bo Zhang
They are beneficial for data representation and classification and have been widely used in recent years.
These improved subspace algorithms not only describe the nonlinear structure of the data well but also can project new data points directly.
Suppose $\mathbb{R}^D$ represents the high-dimensional space of the original data, $\mathcal{M}$ is a low-dimensional manifold embedded in $\mathbb{R}^D$, and $d \ll D$; for a low-dimensional mapping function $f: \mathcal{M} \rightarrow \mathbb{R}^d$, to maintain the intrinsic geometrical structure of the data manifold, find a function $f$ that meets the following condition
The IsoProjection algorithm is able to describe the manifold structure of the data and extract useful information from the data for classification, so its recognition performance is better than that of PCA and LDA.
M. Belkin, P. Niyogi: Laplacian Eigenmaps for Dimensionality Reduction and Data Representation.
Online since: August 2016
Authors: M. Ganesan, S. Sendilvelan
It has been found that a 13% reduction in CO light-off time was achieved with the pre-catalyst (40% vol.), a 50% reduction with the pre-catalyst (20% vol.), and a 66% reduction with the hot air injector system, compared to the TCCS.
Also, a 14% reduction in HC light-off time was achieved with the pre-catalyst (40% vol.), a 43% reduction with the pre-catalyst (20% vol.), and a 63% reduction with the hot air injection system, compared to the TCCS.
The temperatures are measured using K-type thermocouples, stored in a data logger, and sent to the ECS to actuate the motor shafts that open and close the valves.
The temperature of the catalyst was measured at the following points using K-type thermocouples and a data logger.
It has been found that there is a 66% reduction in CO light-off time and a 63% reduction in HC light-off time compared to the TCCS without a pre-catalyst at Position 1.
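The percentages follow the usual relative-reduction definition; a worked sketch with assumed (illustrative, not measured) light-off times:

```latex
% The absolute times below are assumptions for illustration only.
\[
  \text{reduction} \;=\; \frac{t_{\text{TCCS}} - t_{\text{system}}}{t_{\text{TCCS}}} \times 100\%,
  \qquad\text{e.g.}\quad
  \frac{120\,\mathrm{s} - 41\,\mathrm{s}}{120\,\mathrm{s}} \times 100\% \approx 66\%.
\]
```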
Online since: September 2013
Authors: Xiu Fen Fu, Cui Cui Ge
Mining Closed Weighted Frequent Patterns from a Sliding Window over a Data Stream
Ge Cuicui, Fu Xiufen
Faculty of Computer, Guangdong University of Technology, Guangzhou, China
gecuicui1988@163.com; Fuxf@gdut.edu.cn
Keywords: Sliding window; Data stream; Closed weighted frequent pattern; DS_CRWF; Data mining
Introduction
A data stream is an unbounded sequence of massive data elements that is continuously generated.
In a sliding window, new data continually arrives and old data is continually eliminated, so the data in the window both grows and shrinks; this makes mining frequent patterns over a sliding window more difficult.
The adopted data stream is the customer shopping data generated by the IBM synthetic data generator.
The specified dataset is T10I6D1000K; each basic window contains 50,000 transactions of the data stream.
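A minimal sketch of sliding-window counting with weighted support (not the DS_CRWF algorithm itself; item weights and thresholds are hypothetical):

```python
from collections import deque, defaultdict

# Sketch: keep item counts over a sliding window of transactions and
# report items whose weighted support meets a threshold. As transactions
# slide out, their counts are removed, so the window both grows and shrinks.
class SlidingWindowMiner:
    def __init__(self, window_size, weights, min_wsup):
        self.window = deque()
        self.window_size = window_size          # number of transactions kept
        self.counts = defaultdict(int)
        self.weights = weights                  # item -> weight in (0, 1]
        self.min_wsup = min_wsup

    def add(self, transaction):
        self.window.append(transaction)
        for item in transaction:
            self.counts[item] += 1
        if len(self.window) > self.window_size:      # slide: drop oldest
            for item in self.window.popleft():
                self.counts[item] -= 1

    def weighted_frequent_items(self):
        n = len(self.window)
        return {i for i, c in self.counts.items()
                if self.weights.get(i, 1.0) * c / n >= self.min_wsup}

miner = SlidingWindowMiner(window_size=4, weights={"a": 1.0, "b": 0.5}, min_wsup=0.5)
for t in [{"a", "b"}, {"a"}, {"b"}, {"a", "b"}, {"a"}]:
    miner.add(t)
print(miner.weighted_frequent_items())   # {'a'}
```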
Online since: April 2014
Authors: Chun Lin Li, Waleed Al-Museelem
Initially, the data owners maintain message authentication codes (MACs) for the data files being outsourced, so they can verify the integrity of the data by recalculating the MAC of each received file.
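A minimal sketch of that MAC check, assuming HMAC-SHA256 and a shared key (key management and the outsourcing protocol are out of scope here):

```python
import hmac
import hashlib

# Sketch of the owner's integrity check: compute a MAC before outsourcing,
# store it locally, and recompute it over the file returned by the cloud.
def file_mac(path, key):
    """Compute an HMAC-SHA256 tag over a file's contents."""
    h = hmac.new(key, digestmod=hashlib.sha256)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, key, stored_tag):
    """Recompute the MAC of a received file and compare in constant time."""
    return hmac.compare_digest(file_mac(path, key), stored_tag)
```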
The proposed algorithms handle data insertion and manipulation efficiently.
The confidentiality of data is based on the hiding of sensitive data and information.
Data integrity must ensure that data is not subject to damage, modification or deletion.
Users should also be able to access the data, and the cloud should provide adequate data capacity.
Online since: November 2014
Authors: Shi Qing Dou, Xiao Yu Zhang
Data simplification is an important part of spatial data generalization and an effective way to improve rendering speed.
Introduction
When large-scale, high-precision data is applied at a small scale and low precision, unnecessary data redundancy arises.
Eliminating this extra data as needed is a primary task of GIS spatial data processing.
Data simplification is an important part of spatial data generalization.
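The snippet does not name the simplification method used; as one classic example of this technique class, the Douglas-Peucker algorithm removes redundant vertices of a polyline within a tolerance:

```python
import math

def douglas_peucker(points, tol):
    """Classic Douglas-Peucker polyline simplification (illustrative; the
    paper's own method is not given in this snippet). Keeps a vertex only
    if it deviates from the chord by more than `tol`."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = math.hypot(dx, dy) or 1.0
    # Perpendicular distance of each interior point to the chord.
    dists = [abs(dy * (x - x1) - dx * (y - y1)) / norm for x, y in points[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[i - 1] <= tol:
        return [points[0], points[-1]]          # all interior points dropped
    left = douglas_peucker(points[: i + 1], tol)
    right = douglas_peucker(points[i:], tol)
    return left[:-1] + right                    # avoid duplicating the pivot

line = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(douglas_peucker(line, tol=0.5))
```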
Faust, Rendering Vector Data over Global, Multi-resolution 3D Terrain, Symposium on Data Visualization 40 (2003) 213-222.