46 results for "V., Saravanan"
Search Results
2. Solar Photovoltaic Energy Forecasting Using Improved Ensemble Method For Micro-grid Energy Management
- Author
-
Sehani Siriwardana, Thakshila Nishshanka, Adeesha Peiris, M. A. K. S. Boralessa, K. T. M. U. Hemapala, and V. Saravanan
- Published
- 2022
3. A Reliable and Fast Automatic Combination of Deep Features and Species Categorization Using Unified Ensemble Layer
- Author
-
S Menaka, B Shivani, S Malavikka, B S Ganesh, and V Saravanan
- Published
- 2022
4. OWASP Attack Prevention
- Author
-
B. Kiruba, V. Saravanan, T. Vasanth, and B.K. Yogeshwar
- Published
- 2022
5. The Role of Parallel Computing Towards Implementation of Enhanced and Effective Industrial Internet of Things (IOT) Through Manova Approach
- Author
-
Milad Mohseni, Bestoon Abdulmaged Othman, Preeti Raturi, A. B. Mishra, S.Janu Priya, and V. Saravanan
- Published
- 2022
6. Accelerated testing and results of underwater electric thrusters for mini observation class ROVs
- Author
-
Santhosh Ravichandran, Saravanan V, Aditya Natu, Sreeram Arunan, Govind Raj, Vineet Upadhyay, Harish Singh, Sarvesh S, and Shagun Agarwal
- Published
- 2022
7. Z-Source Inverter based reconfigurable architecture for solar photovoltaic microgrid
- Author
-
V. Saravanan, W. D. Prasad, Ktmu Hemapala, M. Arumugam, M. K. Perera, and K. A. H. Lakshika
- Subjects
Power architecture, Computer science, Photovoltaic system, Electrical engineering, Inverter, Microgrid, AC power, Maximum power point tracking, Z-source inverter
- Abstract
This paper proposes a reconfigurable architecture for a residential microgrid (MG). The distinct feature of the residential MG is its power architecture, built around a Z-source inverter (ZSI) for the solar photovoltaic (PV) system, which can be reconfigured between a current-controlling mode, a voltage-frequency-controlling mode, and a reactive-power-controlling mode when the solar system is idle. This improves the utilization factor of the solar PV system and helps maintain power quality in the distribution feeder while providing an uninterrupted power supply to the customer. The proposed architecture is developed in four stages; as the first stage, the current-controlling mode with MPPT is developed in the MATLAB/Simulink environment and the results are discussed in this paper.
- Published
- 2020
8. Demand Pattern of Decentralized Fixed Speed Compressor Air-Conditioning Systems and Demand Response Potential
- Author
-
V. Saravanan, Ktmu Hemapala, and M. A. Kalhan S. Boralessa
- Subjects
Demand reduction, Building model, Energy consumption, Thermostat, Demand response, Air conditioning, Microgrid, Gas compressor
- Abstract
This research focuses on obtaining the demand pattern of fixed-speed compressor air-conditioning systems (FSCAC) and analyzing their demand response potential. First, an experiment is carried out to obtain the demand pattern of an FSCAC, and a model is then developed to simulate its power consumption using the available thermal model of a house in MATLAB/Simulink. Using this model, an aggregated building model is created to simulate real-world power consumption. The building model is used to show the demand reduction achievable by raising thermostat temperature set points: a 50%-60% reduction in the running average demand of the building is obtained for a 4 °C increase of the thermostat set point. The thermal model of a house is then modified to obtain the energy consumption of an FSCAC, and it is shown that a 50% energy reduction over 24 hours of operation is possible with a 4 °C increase of the thermostat set point.
- Published
- 2019
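The setback effect described in the record above can be illustrated with a toy simulation. The sketch below is not the paper's model: it uses a first-order house thermal model with a hysteresis thermostat and compares compressor energy for a 24 °C and a 28 °C set point; all parameters (thermal resistance, capacitance, compressor ratings, dead band, outdoor temperature) are assumed values.

```python
# Illustrative sketch (not the paper's model): a first-order house thermal model with a
# hysteresis thermostat, used to compare compressor energy for two set points.

def simulate(setpoint_c, hours=24, dt_s=60):
    R = 0.005          # thermal resistance of the envelope [°C/W] (assumed)
    C = 2.0e7          # thermal capacitance of the house [J/°C] (assumed)
    cooling_w = 3500.0 # heat extracted by the fixed-speed compressor when ON [W] (assumed)
    elec_w = 1200.0    # electrical demand of the compressor when ON [W] (assumed)
    deadband = 0.5     # thermostat hysteresis [°C]
    t_out, t_in, on = 32.0, setpoint_c, False
    energy_wh = 0.0
    for _ in range(int(hours * 3600 / dt_s)):
        # hysteresis thermostat for a fixed-speed compressor (ON/OFF only)
        if t_in > setpoint_c + deadband:
            on = True
        elif t_in < setpoint_c - deadband:
            on = False
        q = (t_out - t_in) / R - (cooling_w if on else 0.0)  # net heat flow into the house [W]
        t_in += q * dt_s / C
        energy_wh += (elec_w if on else 0.0) * dt_s / 3600.0
    return energy_wh

base, raised = simulate(24.0), simulate(28.0)
print(f"24h energy at 24 °C: {base:.0f} Wh, at 28 °C: {raised:.0f} Wh "
      f"({100 * (base - raised) / base:.0f}% reduction)")
```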
9. Developing Multi-Agent Based Micro-Grid Management System in JADE
- Author
-
W. D. A. S. Wijayapala, Ktmu Hemapala, H.V.V. Priyadarshana, M. A. Kalhan S. Boralessa, and V. Saravanan
- Subjects
Computer science, Energy management, Distributed computing, Photovoltaic system, JADE (Java Agent Development Framework), Energy management system, Smart grid, Distributed generation, Diesel generator
- Abstract
This paper presents an implementation of a multi-agent-based energy management system for a microgrid using JADE (Java Agent Development Framework). The MAS is applied to a microgrid consisting of different distributed energy sources, including a solar PV system, a wind power system, a diesel generator, a storage system, and critical and non-critical loads. Agents developed on the JADE framework are assigned responsibility for the relevant distributed energy sources (DES) and loads. A runtime environment for the agents is created, and a dynamic simulation model is developed in JADE considering the intermittent nature of renewable energy sources. The case studies presented in this paper are modeled on the JADE platform. Developing the MAS in the JADE runtime environment using agent-oriented programming (AOP) lets the agents operate with their autonomous, rational, reactive, and proactive qualities, and the use of the MAS concept in microgrids improves efficiency in various respects. The MAS-based microgrid management system implemented on the JADE platform can be used to carry out simulations that study agent behaviour under different environments and system objectives. The main purpose of the paper is to demonstrate the feasibility of using the multi-agent concept in microgrid energy management systems.
- Published
- 2019
10. Coordination of wind and PSP in India
- Author
-
N. Subhashini and V. Saravanan
- Subjects
Wind power, Power station, Renewable energy, Hydroelectricity, Pumped storage, MATLAB
- Abstract
Wind energy has grown remarkably worldwide, and the importance of renewable energy, especially wind energy, is now widely recognised. However, the available wind power is not fully extracted because of load variation: many wind turbines must be shut down during high wind or low load. A solution to this problem is to coordinate the wind plant with a hydro power plant, in particular a pumped storage power plant. India has good wind energy potential, notably in Tamil Nadu. In this work a wind farm is coordinated with the pumped storage hydro power plant at Tehri in Uttarakhand, India, with the help of PMUs, and the coordination is simulated in MATLAB/Simulink.
- Published
- 2017
11. Intelligent controller implementation for a bioprocess
- Author
-
S. Nagammai and V. Saravanan
- Subjects
Chemical process, Control theory, Genetic algorithm, Evolutionary algorithm, Process control, PID controller, Linear-quadratic-Gaussian control
- Abstract
Proportional-integral-derivative (PID) controllers are widely used as efficient controllers for all kinds of industrial processes, and a large number of chemical processes in industry are controlled using them. However, industrial processes are generally complicated and nonlinear, which can yield inferior performance under conventional PID control, so optimal controllers are needed to enhance the performance of such processes. The genetic algorithm, a type of evolutionary algorithm, is widely used in this respect. In this paper a genetic algorithm is proposed to enhance the performance of a bioprocess. The methodology and efficiency of the proposed approach are compared with traditional methods, namely a conventional PID controller and an LQG controller. The results show that the GA-based controller improves the performance of the process with good stability.
- Published
- 2017
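As a rough illustration of GA-based PID tuning of the kind described in the record above, the sketch below evolves (Kp, Ki, Kd) to minimise the integral of squared error (ISE) for a unit step on an assumed first-order plant. The plant, gain bounds and GA settings are illustrative assumptions, not the paper's bioprocess model.

```python
# Minimal sketch: a tiny genetic algorithm searches PID gains that minimise ISE on an
# assumed first-order plant dy/dt = (-y + u)/tau.
import random

def ise(gains, tau=5.0, dt=0.05, t_end=30.0):
    kp, ki, kd = gains
    y = i_term = prev_e = 0.0
    cost = 0.0
    for _ in range(int(t_end / dt)):
        e = 1.0 - y                      # unit step reference
        i_term += e * dt
        u = kp * e + ki * i_term + kd * (e - prev_e) / dt
        prev_e = e
        y += (-y + u) / tau * dt         # Euler step of the first-order plant
        if abs(y) > 1e6:                 # penalise unstable gain combinations
            return float("inf")
        cost += e * e * dt
    return cost

def ga(pop_size=30, gens=40, bounds=(0.0, 20.0)):
    pop = [[random.uniform(*bounds) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=ise)                          # fitness = low ISE
        parents = pop[:pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]           # arithmetic crossover
            child = [min(max(g + random.gauss(0, 0.5), bounds[0]), bounds[1])
                     for g in child]                              # Gaussian mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=ise)

best = ga()
print("best (Kp, Ki, Kd):", [round(g, 2) for g in best], "ISE:", round(ise(best), 3))
```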
12. Influence of non uniform heating and porous medium on performance of micro channel heat sink
- Author
-
C.K. Umesh and V. Saravanan
- Subjects
Convective heat transfer, Heat transfer coefficient, Heat sink, Fin (extended surface), Heat transfer, Micro heat exchanger
- Abstract
The increasing requirement for heat dissipation in thermal management applications has led to the development of new techniques. A heat sink with micro pin fins is one of the effective techniques that has attracted researchers over the past two decades. In the present work, the performance of an unfinned microchannel heat sink (MCHS), a heat sink with circular pin fins (CPF), and a heat sink with square pin fins (SPF) is numerically analysed to assess the hydrodynamic and heat transfer behaviour for Reynolds numbers ranging from 100 to 900 with water as the coolant. Three heat-flux cases, namely uniform, linear, and sinusoidal, were examined using a three-dimensional conjugate heat transfer model. The results show that the heat sink with square pin fins under uniform heat flux has the highest heat transfer coefficient and pressure drop. Furthermore, the influence of a porous medium on the performance of the heat sink was studied; the results reveal that the hydrodynamic and heat transfer characteristics are strongly influenced by the size and location of the porous medium.
- Published
- 2017
13. Design of reconfigurable monopole antenna using pin Diodes for implants
- Author
-
V. Saravanan and A.P. Merrin
- Subjects
Reconfigurable antenna, Antenna measurement, Electrical engineering, Antenna efficiency, Radiation pattern, Antenna (radio), Monopole antenna
- Abstract
This paper deals with the design of a pattern-reconfigurable antenna for implant devices. Radiation pattern reconfiguration means modifying the spherical distribution of the antenna's radiation pattern; beam steering, in which the direction of the main lobe is changed, is one application of pattern reconfiguration. A monopole antenna is chosen for its compact size. The antenna operates in the MedRadio band (401-406 MHz) and has dimensions of 28 mm x 11.5 mm x 0.6 mm (193.2 mm³). It is fabricated on an FR4 substrate (εr = 4.4), and PIN diodes are used to control the radiation pattern. Advanced Design System (ADS) is used for antenna design and simulation.
- Published
- 2016
14. Design of effective low power image compression algorithm
- Author
-
V. Saravanan, B. Venkatalakshmi, and G. Epshi
- Subjects
Fractal transform, Lossy compression, Fractal compression, Quadtree, Lossless compression, Data compression ratio, Image compression, Data compression
- Abstract
Satellite communication involves large amounts of data storage and transmission, since satellites send data continuously. Storing all of these data and analysing them for various purposes on small, low-cost memory devices is possible only with the help of image compression. Image compression removes redundant information from an image so that it can be stored with reduced storage size, transmission bandwidth, and time; it aims at removing duplication from the source image and is essential for efficient transmission and storage. The objective of this work is to develop an efficient low-power image compression algorithm that achieves a high compression ratio while keeping the compressed output compatible with satellite communication. The proposed system should be a lightweight algorithm with minimum power consumption, low compression time, and a high compression ratio. For this purpose, quadtree fractal image compression and an adaptive fractal-wavelet image compression algorithm are selected, and their performance is evaluated in terms of mean square error, compression ratio, and peak signal-to-noise ratio.
- Published
- 2016
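A minimal sketch of the quad-tree partitioning step used in quad-tree fractal image compression, as referenced in the record above: blocks whose pixel variance exceeds a threshold are split into four quadrants, otherwise they become leaf range blocks. The thresholds, block sizes and random stand-in image are assumptions, and the fractal transform search itself is omitted.

```python
import numpy as np

def quadtree(img, x, y, size, min_size=4, var_thresh=50.0, leaves=None):
    if leaves is None:
        leaves = []
    block = img[y:y + size, x:x + size]
    if size > min_size and block.var() > var_thresh:
        half = size // 2
        for dx, dy in ((0, 0), (half, 0), (0, half), (half, half)):
            quadtree(img, x + dx, y + dy, half, min_size, var_thresh, leaves)
    else:
        # a leaf: store position, size and a cheap approximation (the block mean)
        leaves.append((x, y, size, float(block.mean())))
    return leaves

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64)).astype(float)   # stand-in for a satellite image
leaves = quadtree(image, 0, 0, 64)
print(len(leaves), "range blocks; rough ratio ~", round(64 * 64 / (4 * len(leaves)), 2))
```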
15. Bit error rate analysis of cognitive radio under fading channels
- Author
-
P. Saranya, V. Saravanan, and M. Divya
- Subjects
Multiplexing, Space-time block code, Cognitive radio, Bit error rate, Fading, Decoding methods, Rayleigh fading
- Abstract
The Silver code has been proposed as a 2 × 2 space-time block code that achieves the optimal diversity-multiplexing gain tradeoff for a multiple-antenna system in a cognitive radio network. In this paper the decoding methodology for the Silver code is analysed, followed by a performance comparison with the Alamouti code and V-BLAST in Rayleigh fading environments. Simulation results show that the Silver code outperforms both the Alamouti code and V-BLAST at high SNR levels.
- Published
- 2016
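For context on the comparison in the record above, the sketch below runs a small Monte-Carlo BER simulation of the 2 × 1 Alamouti space-time block code (the baseline scheme the Silver code is compared against) with BPSK over flat Rayleigh fading. The SNR definition and power normalisation are simplifying assumptions; this is not the paper's simulation.

```python
import numpy as np

rng = np.random.default_rng(1)

def alamouti_ber(snr_db, n_pairs=200_000):
    n0 = 10 ** (-snr_db / 10)                       # noise power for unit symbol energy
    s1, s2 = (rng.integers(0, 2, n_pairs) * 2 - 1 for _ in range(2))   # BPSK +/-1
    h1, h2 = ((rng.standard_normal(n_pairs) + 1j * rng.standard_normal(n_pairs))
              / np.sqrt(2) for _ in range(2))       # Rayleigh fading, unit average power
    noise = lambda: (rng.standard_normal(n_pairs) + 1j * rng.standard_normal(n_pairs)) * np.sqrt(n0 / 2)
    r1 = h1 * s1 + h2 * s2 + noise()                        # slot 1
    r2 = -h1 * np.conj(s2) + h2 * np.conj(s1) + noise()     # slot 2 (Alamouti encoding)
    y1 = np.conj(h1) * r1 + h2 * np.conj(r2)                # maximum-ratio combining
    y2 = np.conj(h2) * r1 - h1 * np.conj(r2)
    errs = np.sum(np.sign(y1.real) != s1) + np.sum(np.sign(y2.real) != s2)
    return errs / (2 * n_pairs)

for snr in (0, 5, 10, 15):
    print(f"SNR {snr:2d} dB  BER {alamouti_ber(snr):.4f}")
```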
16. Topologies of single phase Z source inverters for photovoltaic systems
- Author
-
V. Balaji, M. Arumugam, M. Aravindan, and V. Saravanan
- Subjects
Photovoltaic system, High voltage, Inductor, Capacitor, Grid-tie inverter, MATLAB, Z-source inverter, Voltage
- Abstract
This paper deals with a new family of high-voltage-boost inverter topologies for single-phase systems. The proposed topologies are derived from the quasi-Z-source inverter (qZSI), the switched-inductor Z-source inverter (SLZSI), and the switched-inductor quasi-Z-source inverter (SLqZSI), which can realise buck/boost operation, inversion, and power conditioning in a single stage with improved reliability. They provide continuous input current, reduced voltage stress on the capacitors, and lower current stress on the inductors, together with a high voltage-boost inversion ability. The proposed topologies are analysed in steady state and their performance is validated using simulation results obtained in the MATLAB/Simulink environment.
- Published
- 2016
17. A modified FXLMS algorithm for active impulsive noise control
- Author
-
N. Santhiyakumari and V. Saravanan
- Subjects
Adaptive filter, Least mean squares filter, Noise, Control theory, Noise control, Algorithm design, Active noise control
- Abstract
Numerous approaches have been introduced in the literature for active noise control (ANC) systems, and surveys indicate that the filtered-x least mean square (FxLMS) algorithm is the most common choice for the adaptive-filter controller. Researchers typically improve its performance by modifying the step size or the structure of the existing algorithm. In this paper, a soft-threshold-based FxLMS algorithm for impulsive noise is proposed, with the threshold selected using Stein Unbiased Risk Estimation (SURE). The new algorithm improves stability through fast convergence in the presence of heavy-tailed impulsive noise. To demonstrate its performance, simulations are carried out in MATLAB and compared with Akhtar's algorithm.
- Published
- 2015
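The record above modifies the FxLMS algorithm; the sketch below shows only the baseline filtered-x LMS loop that such modifications build on. The primary and secondary paths, filter length and step size are assumed toy values, and the paper's SURE-based soft thresholding is deliberately omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
L, mu, n_samples = 32, 0.005, 20_000
p = np.r_[0.0, 0.2 * rng.standard_normal(15)]   # assumed primary acoustic path
s = np.array([0.0, 0.8, 0.3])                   # assumed secondary path (perfectly estimated)
w = np.zeros(L)                                 # adaptive controller taps

x = rng.standard_normal(n_samples)              # reference noise picked up upstream
d = np.convolve(x, p)[:n_samples]               # disturbance reaching the error microphone

xbuf = np.zeros(L)          # recent reference samples (controller input)
ybuf = np.zeros(len(s))     # recent anti-noise samples (secondary-path input)
sbuf = np.zeros(len(s))     # recent reference samples for the filtered-x computation
fxbuf = np.zeros(L)         # recent filtered-x samples (for the weight update)
err = np.zeros(n_samples)

for n in range(n_samples):
    xbuf = np.roll(xbuf, 1); xbuf[0] = x[n]
    y = w @ xbuf                                # anti-noise output
    ybuf = np.roll(ybuf, 1); ybuf[0] = y
    err[n] = d[n] - s @ ybuf                    # residual at the error microphone
    sbuf = np.roll(sbuf, 1); sbuf[0] = x[n]
    fxbuf = np.roll(fxbuf, 1); fxbuf[0] = s @ sbuf   # reference filtered by the secondary-path model
    w += mu * err[n] * fxbuf                    # FxLMS weight update

print("error power, first vs last 1000 samples:",
      round(float(np.mean(err[:1000] ** 2)), 4), "->",
      round(float(np.mean(err[-1000:] ** 2)), 4))
```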
18. A case study on harmonic distortion in textile industry
- Author
-
V. Suresh Kumar and V. Saravanan
- Subjects
Harmonic analysis, Spectrum analyzer, Electric power system, Textile industry, Nonlinear system, Total harmonic distortion, Harmonics, Grid
- Abstract
This paper discusses harmonic distortion in a modern fine-yarn textile industrial power system. Operating a textile industry with a large number of nonlinear loads affects the power quality of the connected electric network, and harmonic distortion is one of the most important phenomena affecting both the plant distribution network and grid performance. A case study was conducted in a textile plant in the Dindigul district of Tamil Nadu, India, to study the harmonic voltage and current distortion present in the system, using a Class A-compliant power quality analyzer for the measurements. The impact of the harmonics is analysed from the measurements and subsequent calculations. It is found that harmonic distortion is invariably present within the plant's distribution system and introduces losses in the electrical system.
- Published
- 2015
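For reference, the THD figure reported in such a survey is the ratio of the RMS of the harmonic components to the fundamental. The spectrum below is invented for illustration, not measured data from the plant in the record above.

```python
import math

fundamental = 120.0                                # RMS current at 50 Hz [A] (assumed)
harmonics = {5: 28.0, 7: 14.0, 11: 6.0, 13: 4.0}   # typical drive harmonics [A] (assumed)

# THD = sqrt(sum of squared harmonic magnitudes) / fundamental magnitude
thd = math.sqrt(sum(i ** 2 for i in harmonics.values())) / fundamental
print(f"THD_I = {100 * thd:.1f} %")                # about 26.8 % for the assumed spectrum
```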
19. Design and development of Z source cascaded seven level inverter for solar photovoltaic system
- Author
-
S. Birundha and V. Saravanan
- Subjects
Total harmonic distortion, Photovoltaic system, Electrical engineering, Maximum power point tracking, Inverter, MATLAB, Voltage, Z source
- Abstract
This paper presents a Z-source multilevel inverter (ZS-MLI) for photovoltaic (PV) operation. MLIs offer higher power efficiency and can operate at higher voltage levels, and as the number of levels rises, the output approaches the desired sinusoidal waveform with lower total harmonic distortion (THD). A Z-source network followed by a cascaded H-bridge multilevel inverter is used to boost the output voltage to the chosen value. This work proposes an enhanced Z-source-network multilevel inverter, and the simulation of a single-phase, seven-level Z-source cascaded MLI is carried out in MATLAB/Simulink.
- Published
- 2015
20. Cancer diagnosis using automatic mitotic cell detection and segmentation in histopathological images
- Author
-
V. Saravanan and G. Logambal
- Subjects
Feature extraction, Pattern recognition, Image segmentation, Support vector machine, Naive Bayes classifier, Breast cancer, Segmentation, Grading (tumors)
- Abstract
Cancer is a disease characterized by abnormal cell growth in the human body and is evaluated by histopathological examination, which is important for further treatment planning. Tubule formation, mitotic cell count, and nuclear pleomorphism are the three parameters used for cancer grading. The mitotic cell (MC) count is one of the important factors in cancer diagnosis from histopathological images, and MC detection is a challenging task because mitotic cells are small objects with a large variety of shapes. The aim of this work is to evaluate the performance of an SVM (support vector machine) classifier and a Bayesian classifier in cancer diagnosis. The proposed work consists of three modules: 1) pre-processing, 2) MC detection and segmentation, and 3) MC classification. MC detection and segmentation are performed by Bayesian modeling and a local region threshold method; the segmented mitotic cells are then classified by both the SVM classifier and the Bayesian classifier, and their performance is evaluated.
- Published
- 2015
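A small sketch of the final classification stage described in the record above, comparing an SVM and a Gaussian naive Bayes classifier. Synthetic feature vectors stand in for the shape and texture features of candidate mitotic cells, so no histopathology data is involved.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

# stand-in for per-candidate features (area, eccentricity, texture statistics, ...)
X, y = make_classification(n_samples=600, n_features=12, n_informative=6,
                           weights=[0.8, 0.2], random_state=0)   # mitoses are the rare class
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

for name, clf in [("SVM (RBF)", SVC(kernel="rbf", C=1.0, gamma="scale")),
                  ("Gaussian naive Bayes", GaussianNB())]:
    clf.fit(X_tr, y_tr)
    print(f"{name:22s} accuracy = {clf.score(X_te, y_te):.3f}")
```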
21. Active noise control system for narrowband noise using FxLMS algorithm
- Author
-
E. Priya, V. Saravanan, and N. Santhiyakumari
- Subjects
Adaptive filter, Least mean squares filter, Control theory, Robustness, Noise control, Impulse response, Active noise control
- Abstract
Active noise control (ANC) is a method used to reduce noise effectively. The filtered-x least mean square (FxLMS) algorithm is considered the best choice for the adaptive-filter controller owing to its low complexity and robustness. In this paper the FxLMS algorithm is applied to narrowband noise, and a variable learning parameter (variable step size) is used to improve the convergence speed and noise reduction. The objective of the variable-step FxLMS (VSFxLMS) algorithm is to reduce the noise level at the source with improved convergence speed while accounting for the secondary impulse response. Simulations are carried out in MATLAB; narrowband noise reduction is targeted so that effective and efficient noise control is achieved, and a comparison with the fixed-step FxLMS (FSFxLMS) algorithm is shown.
- Published
- 2015
22. Node lifetime assessment based routing for wireless sensor networks
- Author
-
V. Saravanan, S V Vijayasree, and A. Pravin Renold
- Subjects
Wireless sensor network, Mobile wireless sensor network, Multipath routing, Hierarchical routing, Computer network
- Abstract
Wireless sensor networks contain a large number of nodes with limited battery power. As networks of these nodes are usually deployed unattended, network lifetime becomes an important concern, and estimating the lifetime accurately is a problem. Existing methods estimate lifetime using current consumption and battery capacity as impact factors, but they do not provide an accurate lifetime and are not integrated with the routing method. In this paper we implement a method that dynamically estimates the lifetime of a node using current consumption, battery capacity, and temperature as impact factors, and integrates it with routing. The Routing Protocol for Low-Power and Lossy Networks (RPL) is employed: its objective function uses rank and Expected Transmission Count (ETX) as metrics to build the Directed Acyclic Graph, and data transmission is based on node lifetime so as to extend the lifetime of the network. Simulations are carried out in the Cooja simulator under Contiki OS.
- Published
- 2015
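A minimal sketch of the idea in the record above: estimate a candidate parent's remaining lifetime from battery capacity, average current draw and temperature, and prefer parents with longer lifetimes. The derating model, numbers and selection rule are assumptions, not the paper's model.

```python
def lifetime_hours(capacity_mah, avg_current_ma, temp_c):
    # assumed 0.5 %/°C capacity derating above 25 °C; not the paper's temperature model
    derating = 1.0 - 0.005 * max(0.0, temp_c - 25.0)
    return capacity_mah * derating / avg_current_ma

def pick_parent(candidates):
    # rank by lifetime first, then lower ETX, so low-energy nodes are relieved of traffic
    return max(candidates, key=lambda c: (c["lifetime_h"], -c["etx"]))

candidates = [
    {"id": "A", "etx": 1.2, "lifetime_h": lifetime_hours(2400, 1.8, 30)},
    {"id": "B", "etx": 1.0, "lifetime_h": lifetime_hours(2400, 3.5, 45)},
]
print({c["id"]: round(c["lifetime_h"]) for c in candidates},
      "-> chosen parent:", pick_parent(candidates)["id"])
```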
23. A two level variable step size FxLMS algorithm for active noise control system
- Author
-
V. Saravanan and N. Santhiyakumari
- Subjects
Adaptive filter, Least mean squares filter, Noise, Filter design, Control theory, Noise reduction, Active noise control
- Abstract
Various techniques have been proposed for active noise control (ANC) systems, in which the filtered-x least mean square (FxLMS) algorithm is widely used as the adaptive-filter controller that updates the filter coefficients. The performance of the FxLMS algorithm can be further improved by changing the step size as well as the structure of the existing method. In this paper, a two-level variable step size (TLVSS) FxLMS algorithm is proposed for a typical ANC system. The new algorithm improves convergence as well as noise reduction compared with the conventional FxLMS algorithm, and MATLAB simulations are carried out to demonstrate its performance against the conventional FxLMS algorithm.
- Published
- 2015
24. Cloud based automatic street light monitoring system
- Author
-
S. Vijayakumar, V. Saravanan, and M. Karthikeyan
- Subjects
Real-time computing, Cloud computing, Base station, Control system, Embedded system, Street light
- Abstract
The proposed cloud-based automatic system involves automatic updating of data to the lighting system, and it also reads data from the base station in case of emergencies. Zigbee devices are used for wireless transmission of data from the base station to the light system, enabling an efficient street lamp control system. An infrared sensor and a dimming control circuit are used to track human movement within a specific range and to dim or brighten the street lights accordingly, saving a large amount of power. In case of emergency, data are sent from the particular light or light system and effective measures are taken accordingly.
- Published
- 2014
25. A framework for fraud detection system in automated data mining using intelligent agent for better decision making process
- Author
-
V. Saravanan, R. Jayabrabu, and J. Jebamalar Tamilselvi
- Subjects
Computer security, Sensor fusion, Data science, Intelligent agent, Anomaly detection, Decision-making, Cluster analysis, Data integration
- Abstract
Fraud is one of the major problems in the global economy, and it is increasing every year; statistical reports on the global economy indicate that nearly 30% of people across various levels suffered fraud in the past year [1]. Fraud involves one or more people who deliberately and secretly take something of value from someone else for their own benefit. Fraud is as old as humanity, but it takes various forms depending on the situation, and the development of new technologies and techniques also gives criminals new opportunities to commit fraud [2]. Investigators have long relied mostly on traditional methods of data analysis for fraud detection. Frauds happen as individual instances or incidents, but they are repeated offences using old and new methods; the instances are similar in content and appearance but are not identical. Fraud detection is therefore a difficult process, both technically and in crime investigation, and relies not only on simple comparisons but also on association, clustering, prediction, and outlier detection. In consideration of these techniques, this paper proposes an automated fraud detection framework that identifies fraud using intelligent agents, data fusion techniques, and various data mining techniques.
- Published
- 2014
26. Model based essential interactions cluster mining in multivariate time
- Author
-
V. Saravanan and S. Chitra
- Subjects
Multivariate statistics, Correlation, Functional neuroimaging, Algorithm design, Data mining, Cluster analysis, Functional magnetic resonance imaging
- Abstract
Functional magnetic resonance imaging (fMRI) is a functional neuroimaging procedure that uses MRI technology to measure brain activity by detecting associated changes in blood flow. The goal of fMRI data analysis is to detect correlations between brain activation and the task the subject performs during the scan, and also to discover correlations with specific cognitive states, such as memory and recognition, induced in the subject. In this work, we propose a novel framework for clustering the essential fMRI signals based on their interactions and on the correlations present in a multivariate time series. To formalize this framework, we cluster only the important interactions, using the patient's medical records, with an essential clustering algorithm. The essential clusters (EC) are then clustered again based on their dependencies on various brain regions and grouped under specific models, and the detected changes are mined according to the type of cluster grouped under a given model. Our method increases the efficiency of the system as well as its effectiveness, with minimal resource utilization.
- Published
- 2014
27. Zigbee based monitoring and control of melter process in sugar industry
- Author
-
S Arivoli, V Saravanan, and K. Valarmathi
- Subjects
Engineering ,Waste management ,Process (engineering) ,business.industry ,Sugar industry ,Process engineering ,business ,Monitoring and control - Published
- 2014
28. Iris authentication through Gabor filter using DSP processor
- Author
-
R. Sindhuja and V. Saravanan
- Subjects
Biometrics, Iris recognition, Feature extraction, Wavelet transform, Pattern recognition, Hamming distance, Haar wavelet, Euclidean distance, Gabor filter, Computer vision
- Abstract
Biometrics is the measurement of a person's physiological or behavioural characteristics for security purposes. Many types of biometrics are available, including fingerprint, voice, retinal, and iris recognition. Among these, iris recognition offers a high level of security and has several advantages over other biometric authentication methods. This paper analyses iris biometric authentication, which has low error rates compared with other biometric methods and robust algorithms. Various feature extraction methods are already available, such as the Haar wavelet transform and the 1-D dyadic wavelet transform, but they are limited in their application. This paper uses Gabor filters for iris feature extraction, which is more advantageous than the existing methods. Matching algorithms such as binary Hamming distance and Euclidean distance are used to compare the feature extraction methods, and after the features are extracted the system is implemented on an Analog Devices DSP to achieve fast verification performance.
- Published
- 2013
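The matching step mentioned in the record above can be sketched as a normalised Hamming distance between binary iris codes (as produced by Gabor-filter phase quantisation) with a decision threshold. The codes below are random stand-ins and the threshold is an assumed value.

```python
import numpy as np

rng = np.random.default_rng(3)

def hamming_distance(code_a, code_b, mask_a, mask_b):
    valid = mask_a & mask_b                    # compare only bits both templates trust
    return np.count_nonzero((code_a ^ code_b) & valid) / np.count_nonzero(valid)

enrolled = rng.integers(0, 2, 2048, dtype=np.uint8)
mask = np.ones(2048, dtype=np.uint8)           # no occlusion in this toy example

same_eye = enrolled.copy()
same_eye[rng.choice(2048, 150, replace=False)] ^= 1        # ~7 % bit noise
other_eye = rng.integers(0, 2, 2048, dtype=np.uint8)

threshold = 0.32                               # assumed accept/reject threshold
for label, probe in [("same eye", same_eye), ("different eye", other_eye)]:
    hd = hamming_distance(enrolled, probe, mask, mask)
    print(f"{label:14s} HD = {hd:.3f} ->", "accept" if hd < threshold else "reject")
```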
29. Design and development of pervasive classifier for diabetic retinopathy
- Author
-
S. M. Noorul Farhana, B. Venkatalakshmi, and V. Saravanan
- Subjects
Microaneurysm, Feature extraction, Diabetic retinopathy, Screening programme, Gabor filter, Ophthalmology, Diabetes mellitus, Retinopathy
- Abstract
Diabetic retinopathy is a serious complication of diabetes mellitus that can eventually lead to blindness; around 10% of patients with diabetes develop sight-threatening retinopathy. Treatment is available for diabetic retinopathy and is effective in preventing sight loss, which has motivated major efforts in automatic screening, where detection of sight-threatening retinopathy is an important aim. Four types of lesion, namely microaneurysms, haemorrhages, soft exudates, and hard exudates, mark the predominant stages of diabetic retinopathy, and feature extraction is an important step in detecting them. Various feature extraction methods are already available, but they are limited in their application: Gabor filters are used for detecting and differentiating bright lesions, while the wavelet transform is used for detecting microaneurysms alone, so several feature extraction techniques must be combined to discriminate the pathologies. This paper uses an AM-FM feature extraction method, which is more advantageous than the available methods; in contrast to other methods, the same system is applied to detect both red lesions (microaneurysms and haemorrhages) and hard exudates. After the features are extracted, an automatic classification system based on partial least squares is used to discriminate the pathologies.
- Published
- 2013
30. Automated red lesion detection in diabetic retinopathy
- Author
-
V. Saravanan, B. Venkatalakshmi, and Vithiya Rajendran
- Subjects
Microaneurysm, Retina, Lesion detection, Diabetic retinopathy, Fundus (eye), Ophthalmology, Diabetes mellitus, Human eye
- Abstract
Diabetic retinopathy is damage to the retina of the human eye caused by the complications of elevated blood glucose, and it can eventually lead to blindness; the longer a patient has diabetes, the higher the chance of developing diabetic retinopathy. DR is the deterioration of the retinal blood vessels, and microaneurysms are among its earliest symptoms: they appear as isolated dark red spots in the retina due to swelling of the capillaries and weakening of the blood vessels. The number of microaneurysms is used to indicate the severity of the disease, and earlier microaneurysm detection can reduce the incidence of blindness. This work proposes an automated system for diabetic retinopathy detection in colour fundus images obtained with a fundus camera.
- Published
- 2013
31. Security issues in computer networks and steganography
- Author
-
A. Neeraja and V. Saravanan
- Subjects
Steganography, Pixel, Network security, Cryptography, JPEG, Information hiding, Distortion, Computer network
- Abstract
This paper reduces the detectable distortion introduced into a Joint Photographic Experts Group (JPEG) file during data hiding by introducing a new region selection rule. The new rule considers three factors: the horizontal difference (HD), the vertical difference (VD), and the region size (RS). The JPEG image is split into a number of blocks, and each pixel is examined to calculate the variations; depending on the variation, a corresponding amount of secret information is hidden in the image. The proposed method of information hiding helps address security issues in computer networks. Experimental results show that the proposed system hides approximately 45% more secret information than existing methods without increasing the detectable distortion.
- Published
- 2013
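A rough sketch of the region-selection idea in the record above: score each block by horizontal difference (HD), vertical difference (VD) and region size (RS), and embed more bits in busier regions where changes are less detectable. The scoring weights and capacity rule here are invented for illustration and are not the paper's exact rule.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64)).astype(int)   # stand-in for a JPEG luminance plane

def block_capacity(block, w_hd=1.0, w_vd=1.0):
    hd = np.abs(np.diff(block, axis=1)).mean()   # horizontal pixel-difference activity
    vd = np.abs(np.diff(block, axis=0)).mean()   # vertical pixel-difference activity
    score = w_hd * hd + w_vd * vd
    return int(min(score // 16, 3))              # 0..3 secret bits per pixel (assumed rule)

B = 8                                            # region size (RS): 8x8 blocks
capacity_bits = 0
for r in range(0, image.shape[0], B):
    for c in range(0, image.shape[1], B):
        capacity_bits += block_capacity(image[r:r + B, c:c + B]) * B * B

print("total embedding capacity:", capacity_bits, "bits for a 64x64 plane")
```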
32. Dynamic handoff decision based on current traffic level and neighbor information in wireless data networks
- Author
-
V. Saravanan and A. Sumathi
- Subjects
Voice over IP, Wireless network, Real-time computing, Network monitoring, Load balancing, Network layer, Handover, Packet loss, Computer network
- Abstract
In wireless networks, mobile nodes frequently perform layer-2 and layer-3 handoffs. A handoff may occur due to many factors, such as signal strength, load balancing, the number of connections, the frequencies engaged, and so on. Frequent handoffs can disturb real-time services such as voice over IP: normally an interruption of a few milliseconds occurs during the handoff process, and this delay should be minimized for smooth performance. Information exchange between mobile nodes and network monitoring are needed to achieve seamless layer-2 and layer-3 handoff, while rising packet loss rates and heavy traffic can trigger incorrect handoffs. We propose a method for avoiding unbeneficial handoffs and eliminating unnecessary traffic.
- Published
- 2012
33. Comprehensive analysis of Z source DC/DC converters for DC power supply systems
- Author
-
M. Arumugam, P. Sureshkumar, R. Ramanujam, and V. Saravanan
- Subjects
Voltage doubler, Zigzag transformer, Rectifier, Three-phase, Boost converter, Inverter, Transformer, Pulse-width modulation, Voltage
- Abstract
The performance of Z-source DC/DC converter topologies, namely traditional and quasi-Z-source-based DC/DC converters (both single- and three-phase), is analysed for use at the front end of a power conditioning unit for a DC power supply system. Various switching schemes, such as simple-boost PWM, carrier-based PWM, and space-vector PWM control, are adopted to control the inverter switches. A new quasi-Z-source-network-based DC/DC converter topology employing a zigzag transformer is proposed; linear and/or zigzag transformers are implemented individually for step-up as well as isolation functions, and a voltage doubler rectifier (VDR) circuit converts the high AC voltage from the transformer to DC. Simulations are carried out in the MATLAB/Simulink environment, and the results are found to be promising for all of the above switching schemes.
- Published
- 2012
34. Model based controller design for Melter process in sugar industry
- Author
-
K. Valarmathi and V. Saravanan
- Subjects
Model predictive control, Temperature control, Adaptive control, Control theory, PID controller, Process control, Crusher
- Abstract
The B-melter is a process that is very difficult to control by classical means. The output (cane juice) from the crusher is sent to the pan house, which has sections named the A, B, and C pans. High-quality sugar is obtained only from the A-pan house, while the product of the B-pan house, normally called B seed (sugar), consists of B seed and B-heavy molasses, and the viscosity of this massecuite is normally high. To maintain the massecuite viscosity, a melter process is employed in the sugar industry: the massecuite is brought to 60 to 65 brix by adding steam and hot water, reducing its viscosity, and this temperature control ensures the melting ratio of the massecuite. A mathematical model of this process and the tuning parameters for a PID controller are designed using conventional techniques. To reduce the process time and other unwanted disturbances, Model Reference Adaptive Control (MRAC) and Model Predictive Control [7] have been designed; the controller responses are verified with MATLAB simulation results and can be adaptively adjusted online for the varying state of the system and changing operating conditions. This paper presents a method for developing the control algorithm for such a process.
- Published
- 2012
35. Efficient medical image compression technique for telemedicine considering online and offline application
- Author
-
V. Saravanan and Adina Arthur
- Subjects
Lossless compression, Data compression ratio, Lossy compression, Huffman coding, Entropy (information theory), Lossless JPEG, Data compression, Image compression
- Abstract
Telemedicine, characterized by the transmission of medical data and images between users, is one of the emerging fields in medicine. Transmitting medical images over the internet requires substantial bandwidth, and the resolution and number of images per diagnosis make even the data belonging to a single patient very large. There is therefore a real need for efficient compression techniques for medical images, to decrease storage space and to make transferring images over the network for access to electronic patient records more efficient. This work summarizes the transformation methods used in compression, since compression reduces the amount of data needed for storage or transmission. It compares transformation methods such as DPCM (differential pulse code modulation) and a prediction-improved DPCM transformation step, and introduces a transformation that is efficient in both entropy reduction and computational complexity. A new method is obtained by improving the prediction model used in lossless JPEG; the improved prediction increases the energy compaction of the prediction model and as a result reduces the entropy of the transformed image. After transformation, Huffman encoding is used to compress the image. The new algorithm shows better efficiency for lossless compression of medical images, especially for online applications. The results are analysed using MATLAB and implemented in hardware.
- Published
- 2012
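The transformation-plus-entropy idea in the record above can be sketched with a lossless-JPEG-style predictor and a first-order entropy estimate. A smooth synthetic image stands in for a medical scan, and the Huffman coding stage is summarised by the entropy figure rather than implemented.

```python
import numpy as np

def entropy(values):
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

x, y = np.meshgrid(np.arange(128), np.arange(128))
image = ((np.sin(x / 9.0) + np.cos(y / 7.0)) * 60 + 128).astype(np.int32)   # smooth stand-in image

left = np.roll(image, 1, axis=1); left[:, 0] = 0
above = np.roll(image, 1, axis=0); above[0, :] = 0
prediction = (left + above) // 2            # lossless-JPEG-style predictor: (left + above) / 2
residual = image - prediction               # DPCM residual to be Huffman coded

print(f"entropy: raw {entropy(image):.2f} bits/px -> residual {entropy(residual):.2f} bits/px")
```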
36. Design of Optimal Amplifier for Wearable Devices
- Author
-
V. Saravanan, R. Anu Priya, and B. Venkatalakshmi
- Subjects
Amplifier, Transconductance, Bandwidth, Electrical engineering, Wearable computer, Operational amplifier, Resistor, Wearable technology, Voltage
- Abstract
The demand for health-monitoring wearable devices is increasing worldwide among scientists and clinicians. Low-power, low-voltage, and low-noise operation are important concerns for circuits used in continuous personal health monitoring, in order to achieve long battery life, so the need for low-power, low-voltage bio-amplifiers is growing. These amplifiers receive bio-signals through a bio-sensor attached to the patient; the amplifier acts as the interface to the wearable device and amplifies the difference between the two input signals from the bio-sensor, making it an important block at the front end of the device. In this paper, a low-voltage OTA for a bio-potential signal acquisition system is presented with a MOS-bipolar pseudo-resistor configuration. Super source degeneration and a bulk-driving technique operated in the sub-threshold region are used to meet the low-power and low-voltage criteria. The designed bio-amplifier is capable of amplifying bio-potential signals such as ECG, EEG, and EMG in the bandwidth range of 0.5 Hz to 300 Hz.
- Published
- 2011
37. Graphical user interface for enhanced retinal image analysis for diagnosing diabetic retinopathy
- Author
-
V. Saravanan, B. Venkatalakshmi, and G. Jenny Niveditha
- Subjects
Image processing, Diabetic retinopathy, Computer vision, Retinopathy, Graphical user interface, Optic disc
- Abstract
Diabetic retinopathy is a severe eye disease caused by chronic diabetes. This project aims at developing a novel solution for easy diagnosis of diabetic retinopathy, a serious condition prevalent among diabetic patients. The literature shows that direct inspection of retinal images captured by a fundus camera is not sufficient for diagnosing diabetic retinopathy at its initial stages and requires periodic screening, which is time consuming. Instead, if the image is processed so that relevant features such as hard exudates are detected based on characteristics like sharp edges and colour, and highlighted using image processing techniques together with a graphical user interface created in MATLAB 7.8, physicians are aided in a quick examination. The methodology starts with extraction of the optic disc (OD), a feature found in the retina, as the initial stage of extracting hard exudates, followed by detection of yellowish objects and sharp-edged objects. Finally, each step involved in detecting hard exudates is assigned an icon in a single graphical user interface (GUI) window, which serves as a dedicated window for detecting hard exudates. The processed image can be examined comfortably at the early stages of the disease, reducing the examination time.
- Published
- 2011
38. An improved clustering technique based on statistical model preprocessing for gene expression dataset
- Author
-
V Saravanan and N Tajunisha
- Subjects
k-means clustering, Statistical model, Pattern recognition, Preprocessing, Domain knowledge, Data mining, Cluster analysis
- Abstract
Data mining has become an important topic in the effective analysis of gene expression data due to its wide application in the biomedical industry. Within a gene expression matrix there are usually several macroscopic phenotypes of samples, and selecting the genes most relevant and informative for certain phenotypes is an important aspect of gene expression analysis. Most current research focuses on supervised analysis; relatively little attention has been paid to unsupervised approaches, which are important when domain knowledge is incomplete or hard to obtain. The standard k-means clustering algorithm is used in many practical applications, but its output is quite sensitive to the initial positions of the cluster centres. In this paper, we present a new framework for clustering microarray data using informative genes. We propose a statistical method to find informative genes and a method to find initial centroids for k-means clustering: initial clusters are formed with fixed initial centroids, and the statistical method is then used to find informative genes, which are used in turn to obtain an improved clustering. Comparing the results of the original and new approaches shows that the new results are more accurate.
- Published
- 2010
39. An Increased Performance of Clustering High Dimensional Data Using Principal Component Analysis
- Author
-
V. Saravanan and N. Tajunisha
- Subjects
Clustering high-dimensional data, Dimensionality reduction, k-means clustering, Centroid, Pattern recognition, Principal component analysis, Data mining, Cluster analysis, Curse of dimensionality
- Abstract
In many application domains, such as information retrieval, computational biology, and image processing, the data dimension is very high, and developing effective clustering methods for high-dimensional datasets is a challenging problem due to the curse of dimensionality. The k-means clustering algorithm is used in many practical applications, but it is computationally expensive, and the quality of the resulting clusters depends heavily on the selection of the initial centroids and on the dimensionality of the data. When the dimensionality of the dataset is high, the accuracy of the result may fall short of expectations, because the chosen dataset cannot be assumed to be free of noise and flaws; it is therefore necessary to reduce the dimensionality of the dataset in order to improve efficiency and accuracy. This paper proposes a new approach that improves the accuracy of the clustering results by using PCA both to determine the initial centroids and to reduce the dimensionality of the data.
- Published
- 2010
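A small sketch in the spirit of the record above: reduce the data with PCA, derive initial centroids deterministically in the reduced space, and run k-means from them. The particular seeding rule used here (percentile slices along the first principal component) is an illustrative stand-in, not necessarily the paper's exact procedure.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.decomposition import PCA

X, _ = make_blobs(n_samples=600, n_features=50, centers=4, random_state=0)   # high-dimensional data
k = 4

X_red = PCA(n_components=5, random_state=0).fit_transform(X)   # dimensionality reduction

# seed one centroid per slice of the data ordered along the first principal component
order = np.argsort(X_red[:, 0])
slices = np.array_split(order, k)
init_centroids = np.vstack([X_red[idx].mean(axis=0) for idx in slices])

km = KMeans(n_clusters=k, init=init_centroids, n_init=1, random_state=0).fit(X_red)
print("cluster sizes:", np.bincount(km.labels_), " inertia:", round(km.inertia_, 1))
```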
40. A New Variable Threshold Based Active Noise Control Systems for Improving Performance
- Author
-
V. Saravanan, A. Krishnan, and P. Babu
- Subjects
Noise measurement, Rate of convergence, Noise (signal processing), Control theory, Gaussian noise, Wavelet transform, Filter (signal processing), Active noise control
- Abstract
Several approaches have been introduced in the literature for active noise control (ANC) systems. Since the FxLMS algorithm appears to be the best choice as the controller filter, researchers tend to improve the performance of ANC systems by enhancing and modifying this algorithm. In this paper, the existing FxLMS algorithm is modified to provide a new structure that improves the tracking performance and convergence rate: the secondary signal y(n) is dynamically thresholded using the wavelet transform to improve tracking, and the convergence rate is improved by dynamically varying the step size applied to the error signal.
- Published
- 2010
41. Knowledge acquisition and storing learning objects for a learning repository to enhance E-learning
- Author
-
E. Kirubakaran, B. Anuja Beatrice, and V. Saravanan
- Subjects
E-learning, Learning community, Reuse, Knowledge acquisition, Knowledge base, World Wide Web, The Internet
- Abstract
E-learning, Web-enabled education and training, has been one of the key research areas in recent years. Knowledge, a highly structured form of information, is required to perform complex tasks, and e-learning affords new opportunities and greater flexibility for the entire learning community. Current research efforts aim to standardize learning resources and reuse them in diverse educational contexts. This paper discusses how knowledge can be acquired from experts and stored in a knowledge base, and proposes a methodology for reusing learning resources and storing them in a repository.
- Published
- 2010
42. An Effective Classification Model for Cancer Diagnosis Using Micro Array Gene Expression Data
- Author
-
V. Saravanan and R. Mallika
- Subjects
Cancer classification, Cancer, Pattern recognition, Microarray, Support vector machine, Statistical classification, Gene selection, Predictive modelling
- Abstract
Data mining algorithms are commonly used for cancer classification, and prediction models are widely used to classify cancer cells in the human body. This paper focuses on finding a small number of genes that can best predict the type of cancer: from samples taken from several groups of individuals with known classes, the group to which a new individual belongs is determined accurately. The paper uses a classical statistical technique for gene ranking and an SVM classifier for gene selection and classification. The methodology was applied to two publicly available cancer databases; SVM one-against-all and one-against-one methods were used with two different kernel functions, their performance was compared, and promising results were achieved.
- Published
- 2009
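A compact sketch of the pipeline in the record above: rank genes with a classical univariate statistic, keep a small subset, and classify with an SVM under one-vs-rest and one-vs-one strategies. Synthetic expression data stands in for the public microarray sets, and the statistic used here (the ANOVA F-score via f_classif) is an assumed stand-in for the paper's ranking technique.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# stand-in for a microarray matrix: 120 samples x 2000 genes, 3 tumour classes
X, y = make_classification(n_samples=120, n_features=2000, n_informative=30,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

for name, wrapper in [("one-vs-rest", OneVsRestClassifier), ("one-vs-one", OneVsOneClassifier)]:
    model = make_pipeline(SelectKBest(f_classif, k=50),        # gene ranking / selection
                          wrapper(SVC(kernel="linear", C=1.0)))
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```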
43. A Framework of an Automated Data Mining System Using Autonomous Intelligent Agents
- Author
-
J. Rajan and V. Saravanan
- Subjects
Data stream mining, Data science, Intelligent agent, Software agent, Data visualization, User interface
- Abstract
Data mining is the analysis of (often large) observational data sets to find unsuspected relationships and to summarize the data in novel ways that are both understandable and useful to the data owner; in other words, it is a process of finding previously unknown, profitable, and useful patterns hidden in data, with no prior hypothesis. Automated data mining and modeling software gives managers a tool to perform analyses that would otherwise need to be handled by a highly trained researcher. The goal of automated data mining methodologies is not to provide more accurate results but to empower non-expert users to achieve reasonable results with minimum effort. Data mining is a difficult and laborious activity that requires a great deal of expertise to obtain quality results, and new methods for intelligent data analysis are needed to extract relevant information with less effort. With the use of autonomous intelligent agents, several data mining steps can be automated. In this paper we present an automated approach to a data mining system using autonomous intelligent agents.
- Published
- 2008
44. Handling Noisy Data using Attribute Selection and Smart Tokens
- Author
-
V. Saravanan and J.J. Tamilselvi
- Subjects
Data quality, Data integrity, Algorithm design, Feature selection, Data mining, Data warehouse, Data integration
- Abstract
Data cleaning is the process of identifying expected problems when integrating data from different sources or from a single source. Many problems can occur in a data warehouse while loading or integrating data, and the main one is noisy data, caused by misused abbreviations, data entry mistakes, duplicate records, and spelling errors. The proposed algorithm handles noisy data efficiently by expanding abbreviations, removing unimportant characters, and eliminating duplicates. An attribute selection algorithm is used to select attributes before token formation; together, the attribute selection and token formation algorithms reduce the complexity of the data cleaning process and allow data to be cleaned flexibly and effortlessly without confusion. This work uses smart tokens to increase the speed of the mining process and improve the quality of the data.
- Published
- 2008
45. Hash Partitioned apriori in Parallel and Distributed Data Mining Environment with Dynamic Data Allocation Approach
- Author
-
Sujni Paul and V. Saravanan
- Subjects
Hash join, Distributed computing environment, Association rule learning, Relational database, Dynamic data, Hash function, Parallel algorithm, Resource allocation, Data mining
- Abstract
A parallel system is mainly composed of parallel algorithms that are cost-optimal. In this paper, the hash-partitioned apriori (HPA) parallel algorithm is considered. HPA partitions the candidate itemsets among processors using a hash function, much like a hash join in relational databases; it effectively utilizes the whole memory space of all the processors and hence works well for large-scale data mining in a parallel and distributed environment. An optimization technique based on dynamic data allocation is discussed for executing this application in a parallel and distributed environment. Writing parallel data mining algorithms for a distributed environment is a non-trivial task, and the main purpose of the proposed method is to meet challenges associated with parallel and distributed data mining such as (i) minimizing I/O, (ii) increasing processing speed, and (iii) reducing communication cost.
- Published
- 2008
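A toy sketch of the hash-partitioning idea in the record above: candidate itemsets are assigned to "processors" by hashing the itemset, so each processor counts only its own candidates over the transactions. Processors are simulated here as plain dictionaries; communication and the dynamic data allocation discussed in the record are not modelled.

```python
from itertools import combinations

transactions = [{"a", "b", "c"}, {"a", "c", "d"}, {"b", "c", "e"},
                {"a", "b", "c", "e"}, {"b", "e"}]
n_procs = 3

# candidate 2-itemsets (normally generated from the frequent 1-itemsets)
items = sorted(set().union(*transactions))
candidates = [frozenset(pair) for pair in combinations(items, 2)]

# hash-partition the candidates among processors
partitions = {p: {} for p in range(n_procs)}
for cand in candidates:
    partitions[hash(cand) % n_procs][cand] = 0

# each processor scans the transactions and counts only the candidates it owns
for p, counts in partitions.items():
    for t in transactions:
        for cand in counts:
            if cand <= t:
                counts[cand] += 1

min_support = 2
frequent = {c: n for counts in partitions.values() for c, n in counts.items() if n >= min_support}
print("frequent 2-itemsets:", {tuple(sorted(k)): v for k, v in frequent.items()})
```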
46. Harmonic distortion in a modern cement industry-a case study
- Author
-
V. Suresh Kumar, P.S. Kannan, and V. Saravanan
- Subjects
Electric power system, Spectrum analyzer, Nonlinear system, Total harmonic distortion, Harmonics, Grid, Voltage
- Abstract
Operating a cement industry with a large number of nonlinear loads affects the power quality of the connected electric network. Harmonic distortion is one of the most important phenomena affecting both the plant distribution network and grid performance. This paper provides an in-depth discussion of harmonic distortion in the cement plant power system. A case study to determine where significant harmonic currents or voltages exist in the system is performed using a power quality analyzer, and the impact of the harmonics is analysed from these measurements and subsequent calculations. It is found that harmonic distortion is invariably present within the plant's distribution system as well as on the grid side.
- Published
- 2006