Advanced Materials Research Vols. 403-408
Paper Title Page
Abstract: The main aim of this paper is to analyze various recruitment decision-making policies using granular computing, based on a rough set perspective and a fuzzy distance approach, for recruiting a candidate in an organization. An information table is presented which covers factors such as appearance, qualification, experience and communication skills for evaluating a candidate. Depending on the granularity of the knowledge obtained from the information table, rough set concepts are applied to generate fuzzy decision rules, which in turn form the eligibility criteria for candidates appearing in the recruitment interview. The experts on the interview committee state their opinions about the candidates linguistically. A relationship is established, using a fuzzy subset representation, between the eligibility criteria required for the job and the experts' opinions about the interviewed candidates. The indices of fuzziness of the various experts' opinions are measured and compared, and the candidate with the highest grade of merit is selected using the fuzzy distance approach.
802
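The selection step described above can be sketched in a few lines. The criteria, membership grades and candidate names below are invented for illustration, not the paper's data; the sketch only shows the fuzzy Hamming distance and the linear index of fuzziness the abstract mentions.

```python
# Hypothetical sketch of fuzzy-distance candidate selection.
# Criteria, grades and names are illustrative assumptions.

CRITERIA = ["appearance", "qualification", "experience", "communication"]

# Eligibility criteria for the job, expressed as an ideal fuzzy subset.
eligibility = [0.8, 0.9, 0.7, 0.9]

# Experts' linguistic opinions, already mapped to membership grades.
candidates = {
    "A": [0.7, 0.8, 0.6, 0.9],
    "B": [0.5, 0.6, 0.9, 0.4],
}

def index_of_fuzziness(mu):
    """Linear index of fuzziness: distance from the nearest crisp set."""
    return (2.0 / len(mu)) * sum(min(m, 1.0 - m) for m in mu)

def fuzzy_hamming(mu, nu):
    """Fuzzy Hamming distance between two fuzzy subsets."""
    return sum(abs(a - b) for a, b in zip(mu, nu))

def select(cands, ideal):
    """The candidate closest to the eligibility profile has the highest merit."""
    return min(cands, key=lambda c: fuzzy_hamming(cands[c], ideal))

best = select(candidates, eligibility)
```

With these invented grades, candidate A lies closer to the eligibility profile and would be selected.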
Abstract: In this paper, a general framework for risk assessment is given. As a case study, leakage risk in gas transmission pipelines is considered. To estimate the risk, two main terms are defined, "severity" and "likelihood of occurrence", both of which are associated with uncertainty; these uncertainties are modeled using fuzzy set theory. To estimate the risk for the whole pipeline, it is divided into a number of segments, and a new combination rule is proposed to aggregate the calculated segment risk values. This is necessary because, when the evidence sources are in high conflict and their intersection is null, the original Dempster-Shafer combination rule does not yield a sensible outcome. To remedy this, a new data-aggregation method is proposed. Simulation results indicate that the proposed method performs better in risk assessment than the original combination rule of Dempster-Shafer theory.
810
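For background, the original Dempster combination rule that the paper modifies can be sketched as follows; the frame of discernment and the mass values are invented, and the high-conflict example shows the normalization behaviour the abstract criticizes.

```python
# Minimal sketch of Dempster's original combination rule (not the paper's
# modified rule). Masses are dicts mapping frozenset hypotheses to belief mass.

def dempster_combine(m1, m2):
    """Combine two mass functions; also return the conflict mass K."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass falling on null intersections
    if conflict >= 1.0:
        raise ValueError("total conflict: rule undefined")
    k = 1.0 - conflict
    return {s: m / k for s, m in combined.items()}, conflict

# Two highly conflicting evidence sources about a segment's risk level.
LOW, HIGH = frozenset({"low"}), frozenset({"high"})
m1 = {LOW: 0.9, HIGH: 0.1}
m2 = {LOW: 0.1, HIGH: 0.9}
fused, conflict = dempster_combine(m1, m2)
```

Here 82% of the mass is conflict, and renormalization splits the remainder evenly, which is the kind of counter-intuitive outcome a modified rule aims to avoid.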
Abstract: Image processing techniques have been used over the years to convert printed material into electronic form. In our work we exploit the fact that some applications may find such conversion redundant and yet satisfactorily meet the demands of the end user. Using the horizontal and vertical white spaces present in any document, independent regions of text, pictures, tables etc. can be identified. Inherent characteristic disparities are then used to distinguish pictures from text, and section headings from the explanations that follow them. A table of contents, showing each heading and the associated page number, is generated and displayed in the browser, with each heading hyperlinked to the corresponding page of the original document. HTML code is written dynamically, using file-handling techniques in MATLAB, to accommodate the variable number of headings obtained from different documents and from different pages of a single document. The platform was tested on various languages, and it was verified that the method is language independent.
817
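The white-space idea at the heart of the segmentation can be sketched on a toy binary page: runs of rows containing ink, delimited by blank rows, become independent regions. The page below is an invented example, and only the horizontal (row) direction is shown.

```python
# Sketch of horizontal white-space segmentation on a binary page
# (1 = ink, 0 = white). The page contents are illustrative.

page = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],   # region 1
    [0, 1, 1, 0],
    [0, 0, 0, 0],   # horizontal white space separating regions
    [1, 0, 0, 1],   # region 2
    [0, 0, 0, 0],
]

def horizontal_regions(rows):
    """Split a binary page into (start_row, end_row) bands of ink."""
    regions, start = [], None
    for i, row in enumerate(rows):
        if any(row):
            if start is None:
                start = i
        elif start is not None:
            regions.append((start, i - 1))
            start = None
    if start is not None:
        regions.append((start, len(rows) - 1))
    return regions

bands = horizontal_regions(page)
```

Applying the same pass column-wise inside each band yields the independent rectangular regions the abstract describes.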
Abstract: This paper investigates the effectiveness of using lampposts, which are found with high frequency in university campus environments, as landmarks in a 2D LIDAR-based Simultaneous Localization and Mapping (SLAM) framework. Lampposts offer several benefits over other forms of landmarks: their unique spatial signature makes it possible to design effective extraction algorithms; they have a very small spatial size; and their use removes the challenge of determining corresponding locations between different views, which is a major difficulty when larger objects are used as landmarks. The proposed SLAM algorithm contains three stages. First, LIDAR segmentation is performed. Next, each object is input to a binary classifier, which identifies objects with a high probability of being lampposts. Finally, these extracted lampposts are input to an Iterative Closest Point (ICP) based SLAM algorithm. The ICP algorithm used extends the traditional ICP algorithm by filtering out associations caused by noise. Results achieved by the proposed system were very positive: an accurate map of a university's lampposts was created, and localization was very accurate when compared to GPS ground truth.
823
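The core update inside any 2D ICP loop, recovering the rigid rotation and translation that best aligns paired landmark positions, has a closed form. The sketch below shows only that step, under the assumption that correspondences have already been found and noisy associations filtered; the point pairs in the test are invented.

```python
import math

# Closed-form 2D rigid alignment (the update step inside an ICP loop).
# Correspondence search and noise filtering are omitted; point-pair data
# used with this function is illustrative.

def rigid_2d(src, dst):
    """Best-fit rotation angle and translation mapping src points onto dst."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    # Accumulate cross- and dot-products of the centred pairs.
    s_cross = s_dot = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        ax, ay = sx - csx, sy - csy
        bx, by = dx - cdx, dy - cdy
        s_cross += ax * by - ay * bx
        s_dot += ax * bx + ay * by
    theta = math.atan2(s_cross, s_dot)
    c, s = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, (tx, ty)
```

Iterating this step with nearest-neighbour correspondences between extracted lamppost positions gives the map-and-localize behaviour the abstract describes.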
Abstract: Breast cancer, the most common cancer in women, is one of the major causes of increased mortality among women, especially in developed countries. Doppler ultrasound is an important noninvasive diagnostic tool for identifying breast malignancies. We present a novel technique, based on image processing, for segmenting blood vessels in ultrasound color Doppler images. The proposed technique decomposes a complex object representing either two or more vessels artificially linked together or a main vessel with its branches. We segment the blood vessels in the color Doppler images and count the number of vessels to detect breast malignancy. MATLAB has been used to simulate the algorithm, and the results obtained are presented in this paper. The result is a set of distinct vessels that can be used in further object recognition and quantification applications.
830
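Once the vessels have been separated, counting them reduces to counting connected components in a binary mask. The sketch below shows that counting step only, on an invented mask; the paper's decomposition of artificially linked vessels is not reproduced.

```python
from collections import deque

# Counting distinct vessels as 4-connected components in a binary mask.
# The mask is an invented toy example.

mask = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 0, 1],
]

def count_vessels(grid):
    """Count 4-connected components of 1-pixels via breadth-first search."""
    h, w = len(grid), len(grid[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for i in range(h):
        for j in range(w):
            if grid[i][j] and not seen[i][j]:
                count += 1
                q = deque([(i, j)])
                seen[i][j] = True
                while q:
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and grid[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return count

n_vessels = count_vessels(mask)
```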
Abstract: In this paper we propose a new image steganography technique for secure communication between sender and receiver. At the sender, two steps are followed: first, the secret information is encrypted with the Blowfish algorithm; second, the cipher text is embedded in the LSB (least significant bit) and LSB-minus-one locations of selected pixels (bytes) of the carrier image, one pixel being 8 bits in 8-bit grayscale. The pixels are selected by a dynamic evaluation function which, depending on the cipher-text bits, decides on which pixels the different cipher-text bits are embedded. At the receiver, two steps are also followed: the cipher bits are retrieved from the stated locations of the image and then decrypted with the Blowfish algorithm to recover the secret information. Because the embedding byte locations are decided by the bits of the cipher text, this is a dynamic steganography. The approach provides two levels of security, one at the cryptography level and the other at the steganography level, and has been validated through a large number of experiments.
835
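The embedding step alone can be sketched with bit masks: two cipher bits replace the two lowest bits of each selected pixel byte. The fixed pixel positions below stand in for the paper's dynamic evaluation function, and the Blowfish encryption is assumed to have happened beforehand; the pixel values are invented.

```python
# Sketch of two-bit LSB embedding and extraction. Pixel selection here is
# fixed, standing in for the paper's dynamic evaluation function; the
# carrier bytes and cipher bits are illustrative.

def embed_pair(byte, two_bits):
    """Write two cipher bits into the lowest two bits of a pixel byte."""
    return (byte & 0b11111100) | two_bits

def extract_pair(byte):
    """Read the two lowest bits back out of a pixel byte."""
    return byte & 0b00000011

carrier = [200, 137, 96, 255]           # 8-bit grayscale pixel values
cipher_bits = [0b10, 0b01, 0b11, 0b00]  # two cipher bits per selected pixel

stego = [embed_pair(p, b) for p, b in zip(carrier, cipher_bits)]
recovered = [extract_pair(p) for p in stego]
```

Because only the two lowest bits change, each pixel value moves by at most 3, which is why the embedding is visually imperceptible.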
Abstract: In this paper we propose a technique for secure communication between sender and receiver that combines cryptography and steganography, with an image as the carrier. We extend the existing Hill cipher to increase its robustness and use it as our cryptographic algorithm: this extended Hill cipher, a new block cipher with a 128-bit key, encrypts the secret message. The cipher text of the secret message is then embedded into the carrier image in the 6th, 7th and 8th bit locations of selected pixels (bytes), the 8th bit of a pixel (byte) being the least significant bit (LSB). Pixel selection depends on the bit pattern of the cipher text, so the embedding pixels differ for different messages; to know which pixels of the image hold the cipher text, one must know the cipher-text bits, which makes the steganography stronger. Because the embedding pixels are chosen at run time, this is a dynamic steganography. After embedding, the resulting image is sent to the receiver, who applies the reverse of the sender's process to recover the secret message.
842
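For background, the classical 2x2 Hill cipher that the paper extends works by matrix multiplication modulo 26; the 128-bit extended variant itself is not reproduced here. The key matrix below is a standard invertible example, not the paper's key.

```python
# Classical 2x2 Hill cipher over A-Z, shown as background for the paper's
# extended 128-bit variant (not reproduced). The key is an illustrative
# matrix that is invertible mod 26 (det = 9, gcd(9, 26) = 1).

KEY = [[3, 3], [2, 5]]
KEY_INV = [[15, 17], [20, 9]]   # inverse of KEY mod 26

def hill(text, key):
    """Encrypt (or, with the inverse key, decrypt) an even-length A-Z string."""
    nums = [ord(c) - ord('A') for c in text]
    out = []
    for i in range(0, len(nums), 2):
        x, y = nums[i], nums[i + 1]
        out.append((key[0][0] * x + key[0][1] * y) % 26)
        out.append((key[1][0] * x + key[1][1] * y) % 26)
    return ''.join(chr(n + ord('A')) for n in out)

cipher = hill("HELP", KEY)
plain = hill(cipher, KEY_INV)
```

Decryption is simply the same routine run with the inverse key matrix, which is why the key must be invertible modulo 26.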
Abstract: An interesting problem in biometrics research is human identification through reliable extraction of the anatomical and behavioral patterns of a person's manner of walking. To date, identification through these characteristics has been addressed mostly with marker-based methods, which have limited applications in the area of security, since a person must be 'marked' for correct identification. To mitigate this limitation, we propose a marker-less, model-based approach (ST-Gait) for human identification using gait. The proposed method is simple but effective and involves low computational complexity. It is validated on a well-known benchmark database (gait database A of CASIA). The encouraging experimental results show that the technique achieves an accuracy of 90% and can be a promising tool for human identification in the area of security.
850
Abstract: This paper presents cascading neural networks using an adaptive sigmoidal function (CNNASF). The proposed algorithm emphasizes both architectural adaptation and functional adaptation during training: it is a constructive approach that builds the cascading architecture dynamically. The activation functions used at the hidden-layer nodes belong to a well-defined sigmoidal class and are adapted during training. The algorithm determines not only the optimum number of hidden-layer nodes but also the optimum sigmoidal function for them. A simple variant derived from CNNASF fixes the sigmoid function used at the hidden-layer nodes. The two variants are compared on five regression functions. Simulation results reveal that the adaptive sigmoidal function presents several advantages over the traditional fixed sigmoid function, resulting in increased flexibility, smoother learning, better convergence and better generalization performance.
858
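The "adaptive sigmoidal function" idea can be shown in miniature as a sigmoid with a trainable slope parameter, together with the gradient needed to adapt that slope during training. This is a generic sketch of the concept, not the paper's specific sigmoidal class, and the slope values are illustrative.

```python
import math

# A sigmoid with a trainable slope parameter, sketching the idea of
# adapting the activation function during training. The specific slope
# values are illustrative assumptions.

def sigmoid(x, slope=1.0):
    """Sigmoid with an adjustable slope parameter."""
    return 1.0 / (1.0 + math.exp(-slope * x))

def sigmoid_grad_slope(x, slope):
    """Partial derivative w.r.t. the slope, used to adapt it by gradient descent."""
    s = sigmoid(x, slope)
    return x * s * (1.0 - s)

y_shallow = sigmoid(1.0, slope=0.5)   # gentler activation
y_steep = sigmoid(1.0, slope=4.0)     # sharper, near-step activation
```

Treating the slope as one more trainable parameter per hidden node is what gives the adaptive variant its extra flexibility over a fixed sigmoid.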
Abstract: This paper is a comparative study of image denoising using the previously known discrete wavelet transform (DWT) and a new type of wavelet transform, the diversity-enhanced discrete wavelet transform (DEDWT). The DWT has two parameters, the mother wavelet and the number of iterations, and for every noisy image there is a best pair of parameters that yields the maximum output peak signal-to-noise ratio (PSNR). Because denoising algorithms are sensitive to the parameters of the wavelet transform used, this paper compares DEDWT with DWT. The diversity is enhanced by computing wavelet transforms with different parameters; after filtering each set of detail coefficients, the corresponding wavelet transforms are inverted and the estimated image with the highest PSNR is extracted. To benchmark against the best possible denoising method, three thresholding techniques are compared. The work presented here is practical and implementation oriented.
866
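The basic building block the comparison rests on, one level of a DWT followed by thresholding of the detail coefficients and inversion, can be sketched with the Haar wavelet. The signal and threshold below are invented; DEDWT would repeat this with several transform parameters and keep the highest-PSNR estimate.

```python
import math

# One level of the Haar DWT with soft thresholding of the detail
# coefficients: the elementary wavelet-denoising step. Signal and
# threshold values are illustrative.

def haar_level(signal):
    """One Haar analysis level: approximation and detail coefficients."""
    a = [(signal[i] + signal[i + 1]) / math.sqrt(2) for i in range(0, len(signal), 2)]
    d = [(signal[i] - signal[i + 1]) / math.sqrt(2) for i in range(0, len(signal), 2)]
    return a, d

def inverse_haar(a, d):
    """Invert one Haar level back to the signal."""
    out = []
    for s, t in zip(a, d):
        out.append((s + t) / math.sqrt(2))
        out.append((s - t) / math.sqrt(2))
    return out

def soft_threshold(coeffs, t):
    """Shrink coefficients toward zero by t; small ones become exactly zero."""
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

noisy = [4.1, 3.9, 4.05, 3.95, 0.1, -0.1, 0.0, 0.05]
approx, detail = haar_level(noisy)
denoised = inverse_haar(approx, soft_threshold(detail, 0.2))
```

Hard thresholding (zeroing small coefficients without shrinking the rest) drops into the same pipeline by swapping the threshold function, which is how the three thresholding techniques can be compared.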