The Asian Science Citation Index is committed to providing authoritative, trusted and significant information through coverage of the most important and influential journals, to meet the needs of the global scientific community.
ASCI Database
308-Lasani Town,
Sargodha Road,
Faisalabad, Pakistan
Fax: +92-41-8815544
Articles by A. Kannan
Total Records (19) for A. Kannan
  P.V. Elvizhy , A. Kannan , S. Abayambigai and A.P. Sindhuja
  In this study, an automatic food recognition system using a multiclass Support Vector Machine (SVM) classifier is presented. For classification of a food item, four features are considered, viz., size, shape, color and texture. Previous works considered only single food items, but in this study mixed foods are also taken into account; to detect mixed foods, the Region Of Interest (ROI) method is used. Since four important features are considered for classification, the system provides high accuracy. The system is built for food image processing and uses a nutritional fact table for calorie measurement. Three techniques are adopted to extract features: the Scale Invariant Feature Transform (SIFT) method extracts the shape of images, the Gabor method extracts texture features and a color histogram extracts the color features of an image. After extracting these features, the image is classified using a multiclass SVM to identify the class of the provided food image. By finding the area and volume of the food samples, calorie values are calculated. The multiclass SVM strategies "one-against-one" and "one-against-all" are compared with a binary SVM on the food samples. Furthermore, the results show that the proposed methods become more favorable as the number of classes increases.
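As a minimal sketch of the comparison the abstract describes (not the authors' implementation), the "one-against-one" and "one-against-all" multiclass SVM strategies can be contrasted with scikit-learn; the synthetic feature vectors below are a hypothetical stand-in for the SIFT/Gabor/color-histogram features.

```python
# Illustrative sketch: comparing one-vs-one and one-vs-rest multiclass
# SVM strategies on synthetic data standing in for food feature vectors.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

# Synthetic stand-in for the SIFT/Gabor/colour-histogram feature vectors.
X, y = make_classification(n_samples=300, n_features=12, n_informative=8,
                           n_classes=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# "one-against-one" trains a binary SVM per class pair;
# "one-against-all" trains one binary SVM per class.
ovo = OneVsOneClassifier(SVC(kernel="rbf")).fit(X_tr, y_tr)
ovr = OneVsRestClassifier(SVC(kernel="rbf")).fit(X_tr, y_tr)
print("one-vs-one accuracy:", ovo.score(X_te, y_te))
print("one-vs-rest accuracy:", ovr.score(X_te, y_te))
```

One-vs-one fits more (but smaller) binary problems, which is why the relative cost of the two strategies shifts as the number of food classes grows.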
  S. Jothi Muneeswari , S. Ganapathy and A. Kannan
  In Wireless Sensor Networks, there are limitations on the packet delivery ratio and energy consumption. Moreover, data gathering also needs improvement through the use of intelligent techniques. To overcome these problems, we propose a new Particle Swarm Optimization (PSO) and cluster based routing and data gathering algorithm for effective data collection and routing in mobile wireless sensor networks. A combined metric is used for intelligent routing by combining the traffic load, packet delivery ratio and residual energy. The fitness function of the proposed PSO based routing algorithm is then able to provide an optimal value by considering this combined metric. The main advantage of the proposed routing algorithm is that it helps to perform load balancing for routing with respect to cluster loads. The cluster head aggregates the data collected from its members and transmits them to the base station. Through simulation results, we show that the packet delivery ratio is increased and energy consumption is reduced when the proposed routing algorithm is used along with PSO-based data gathering.
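The combined metric idea can be sketched as follows; the weights, node values and the simple "pick the best next hop" selection are illustrative assumptions, not the paper's exact PSO fitness formulation.

```python
# Illustrative sketch (assumed weights and data): a combined routing
# metric that rewards high residual energy and packet delivery ratio
# while penalising traffic load, as the abstract describes.

def combined_metric(residual_energy, delivery_ratio, traffic_load,
                    w_e=0.4, w_d=0.4, w_l=0.2):
    """Higher is better; all inputs are assumed normalised to [0, 1]."""
    return w_e * residual_energy + w_d * delivery_ratio - w_l * traffic_load

# Hypothetical cluster members: (node id, energy, PDR, load).
nodes = [("n1", 0.9, 0.80, 0.30),
         ("n2", 0.5, 0.95, 0.10),
         ("n3", 0.2, 0.99, 0.05)]

# A PSO fitness function could score candidate routes with this metric;
# here we simply pick the single best next hop for illustration.
best = max(nodes, key=lambda n: combined_metric(*n[1:]))
print(best[0])  # n1: high energy outweighs its higher load
```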
  K. Selvakumar , L. Sai Ramesh and A. Kannan
  Trust management is an important aspect of enhancing secure packet transmission in Wireless Sensor Networks (WSNs), and the kernel of trust management is the estimation of trust. If a trust prediction model is not robust enough to withstand the malicious nodes in the network, it affects the performance as well as the energy consumption of the entire system. This research study presents a novel node trust estimation model that is very active and robust in the presence of malicious nodes. Furthermore, this research combines a fuzzy expert system based inference mechanism with trust to achieve secure data transmission that optimizes the energy of the WSNs. Extensive simulations have been conducted in this research and the evaluation of results shows that this model is better than other existing trust models.
  S. Muthurajkumar , M. Vijayalakshmi and A. Kannan
  In cloud computing, heterogeneous multi-core server processors are present across clouds and data storage centers. The overall performance of a cloud system can therefore be optimized by providing effective and secure storage and retrieval methods. This paper proposes a new algorithm to optimize power and performance based on load distribution and balancing methods for cloud computing. The proposed algorithm performs performance optimization using speed, time, energy and security. The method has been implemented in a cloud environment, the efficiency of the proposed scheme is compared with that of existing schemes and it is found that storage and energy are optimized.
  N. Leema , H. Khanna Nehemiah , A. Kannan and J. Jabez Christopher
  Diabetes is a major health problem that society faces today. Diagnosis and treatment of diabetes will improve the quality of life of affected individuals. A Clinical Decision Support System (CDSS) serves as an aid for junior clinicians to diagnose diabetes in the absence of an expert diabetologist. This research aims to develop a CDSS to diagnose the presence or absence of Gestational Diabetes Mellitus (GDM). The framework used to develop the CDSS has three subsystems, namely a preprocessing subsystem, a training subsystem and a classification subsystem. Noisy values are handled by the preprocessing subsystem. The training subsystem fuzzifies the preprocessed data and constructs the hidden nodes of the Radial Basis Function Neural Network (RBFNN) using the Gaussian membership function. The exact interpolation property of the RBFNN is used to extract the weights between the hidden layer and the output layer. During RBFNN training, each instance in the training set is considered as a fuzzy rule. The extracted weights are used to prune the generated fuzzy rules. Finally, the pruned rules are stored in a knowledge base. The fuzzy inference system uses the rules from the knowledge base to classify the samples in the testing set. The CDSS for Gestational Diabetes Mellitus attains an overall accuracy of 88.31% with 79.31% sensitivity and 93.75% specificity, yielding classification performance comparable to the works of other researchers in the past decade. The CDSS serves as a second source of opinion for junior clinicians in the diagnosis of GDM, and the classification framework used in it can be adopted for other clinical datasets.
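The Gaussian membership function used to fuzzify an input relative to an RBF hidden node can be sketched in a few lines; the centre and width values below are hypothetical, not taken from the paper.

```python
# Minimal sketch of the Gaussian membership function used to fuzzify an
# input x relative to an RBF hidden node with centre c and width sigma.
import math

def gaussian_membership(x, c, sigma):
    # Membership is 1.0 at the centre and decays smoothly with distance.
    return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))

print(gaussian_membership(5.0, 5.0, 2.0))             # 1.0 at the centre
print(round(gaussian_membership(7.0, 5.0, 2.0), 3))   # 0.607 one sigma away
```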
  G. Malini , T. Mala and A. Kannan
  In the past few years, e-learning has become one of the most sought-after and powerful twenty-first-century tools. There is a need to redesign the current educational system for internet- and network-based, technology-enabled education using e-learning. In this study, a new approach for cloud based content delivery is proposed in which the e-learning content is stored in a distributed manner across the cloud, thereby providing easier and faster access to course materials along with fault tolerance. In this model, the future content requests of the users are predicted in advance from the history of requests made by the users, with the help of Markov model based prediction, and are provided accordingly. The predicted content, if not available in the user's cloud site storage, is replicated there. Through the experiments carried out in this research, it has been shown that the system is scalable to a large number of users and that the deployed prediction minimizes the access time and response time of the users.
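The Markov-model prediction step can be sketched as a first-order chain over a hypothetical request history (the content names are illustrative): the most probable next request given the current one is the candidate for prefetching or replication.

```python
# Sketch of first-order Markov prediction of a user's next content
# request from past transitions, as the abstract describes.
from collections import Counter, defaultdict

# Hypothetical per-user request history.
history = ["intro", "lesson1", "quiz1", "lesson1", "quiz1", "lesson2"]

# Count observed transitions between consecutive requests.
transitions = defaultdict(Counter)
for prev, nxt in zip(history, history[1:]):
    transitions[prev][nxt] += 1

def predict_next(current):
    """Most frequent next request after `current` (None if unseen)."""
    counts = transitions.get(current)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("lesson1"))  # quiz1 (followed lesson1 both times)
```

If the predicted content is missing from the user's cloud site, the replication step would copy it there ahead of the actual request.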
  G. Saranya , H. Khanna Nehemiah , A. Kannan and S. Vimala
  Bad smells are symptoms of code decay that lead to severe maintenance problems. Shotgun surgery is a smell in which a change in one class causes many small changes in several other classes. Several approaches identify bad smells based on the definition of rules and on change history information. These rules are combinations of software metrics and threshold values, which are sometimes unable to detect code decay such as shotgun surgery, since it is difficult to find the best threshold value for rule based detection and also difficult to find the best combination of metrics from historical information. Detecting shotgun surgery from history also requires a sufficient record of observable co-changes, without which the change history approach is not possible. Therefore, these techniques cannot report accurate instances of the shotgun surgery bad smell. As an alternative, this study proposes a similarity measure distribution model framework for detecting the shotgun surgery bad smell in object oriented programs without the need for change history information. The framework is evaluated on the HSQLDB, TYRANT, XERCES-J and JFREE CHART open source software. To enable the detection of shotgun surgery, certain class files of the software under experimentation are modified. The results obtained through this framework are compared with those obtained from the bad smell detection tools inFusion and iPlasma in terms of precision and recall. From the results it is inferred that shotgun surgery can be detected more accurately using the proposed approach, which improves maintainability by detecting the shotgun surgery bad smell.
  R. Rakesh and A. Kannan
  Web security pertains to the design of efficient security measures that guard against attacks carried out over the internet. Attacks such as denial of service, cross site scripting, injection, broken authentication and session management and social engineering exist as hindrances to web services and end users. Phishing is a kind of social engineering attack: a malicious activity in which personal and confidential information is obtained from end users by luring them towards an illegitimate web page or Uniform Resource Locator (URL). In this study, a novel approach to anti-phishing using a Theil decision tree classifier is proposed, where the proposed algorithm computes optimal node values, essential for identifying the splitting attribute of the constructed decision tree, which is then used to classify malicious web pages or URLs.
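Selecting a splitting attribute for such a tree can be sketched as below; the paper computes Theil-based node values, while this stand-in uses standard information gain over a tiny hypothetical URL-feature dataset to show the same selection mechanics.

```python
# Sketch of splitting-attribute selection for a decision tree classifier.
# Standard information gain is used here as an illustrative stand-in for
# the paper's Theil-based node values; the URL features are hypothetical.
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    total = entropy(labels)
    for value in set(r[attr] for r in rows):
        subset = [l for r, l in zip(rows, labels) if r[attr] == value]
        total -= len(subset) / len(labels) * entropy(subset)
    return total

# Hypothetical URL features: has_ip_address is perfectly predictive here.
rows = [{"has_ip_address": 1, "long_url": 1},
        {"has_ip_address": 1, "long_url": 0},
        {"has_ip_address": 0, "long_url": 1},
        {"has_ip_address": 0, "long_url": 0}]
labels = ["phish", "phish", "legit", "legit"]

best = max(rows[0], key=lambda a: information_gain(rows, labels, a))
print(best)  # has_ip_address
```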
  Maganti Venkatesh , M. Krishnamurthy , A. Swarupa and A. Kannan
  In a traditional classroom, when a good teacher observes that a particular student finds some learning material hard to comprehend, he/she offers simpler materials and simpler explanations. The teacher comprehends the complexity levels of the available learning materials and customizes the set of materials offered to a student based on the classroom or homework performance of that student. In the age of open digital learning, the available learning materials have grown by orders of magnitude, while the likelihood of a good human teacher paying direct attention to an average individual learner is now very low. Thus, it has become necessary to build automated systems that can comprehend the complexity levels of learning materials, so as to auto-select and auto-suggest suitable sets of learning materials for each individual learner. This study is an attempt to bring personalization into massive online learning by auto-tagging the content of topics and courses and by auto-suggesting suitable materials based on the performance of the learner.
  S. Behin Sam , S. Sujatha , A. Kannan and P. Vivekanandan
  Distributed Denial of Service (DDOS) attacks have emerged as a prevalent way to shut an organization off from the internet and have resulted in financial losses to such organizations. In a DDOS attack, an adversary attempts to disconnect network elements by disabling communication links or nodes. The effectiveness of DDOS defenses depends on factors such as the specific attack scenario and various characteristics of the network routers. However, little research has focused on the nature of the network's topology, which can also be an effective DDOS defense. This study focuses on adversaries who try to disable communication links and stresses the need for either strong connectivity or m-connectivity among the nodes (routers). This approach discourages the adversary from attempting to disable the network, as the cost of causing the damage increases. Validation of the approach was performed using a network simulator and the results are shown.
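The redundancy property the study argues for can be sketched with a small connectivity check: a topology deters a link-disabling adversary when no single link failure partitions it (edge connectivity of at least 2; the ring and chain topologies below are hypothetical examples).

```python
# Sketch (hypothetical topologies): checking that a router graph stays
# connected after any single link is disabled, the redundancy property
# that raises an adversary's cost of partitioning the network.
from collections import deque

def connected(nodes, edges):
    # Breadth-first search from an arbitrary node.
    adj = {n: [] for n in nodes}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    start = next(iter(nodes))
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen) == len(nodes)

def survives_any_link_failure(nodes, edges):
    # Connected after every possible single-link removal.
    return all(connected(nodes, [e for e in edges if e != removed])
               for removed in edges)

ring = ({"a", "b", "c", "d"}, [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")])
chain = ({"a", "b", "c"}, [("a", "b"), ("b", "c")])
print(survives_any_link_failure(*ring))   # True: a ring tolerates one cut
print(survives_any_link_failure(*chain))  # False: one cut partitions a chain
```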
  G. NaliniPriya , A. Kannan and P. AnandhaKumar
  In medical information systems, the data available for learning and prediction are multivariate in nature, and some of the classification models generally used in the design of medical decision support systems do not provide good performance. In this study, the researchers address ways to improve the performance of a supervised learning based classification algorithm. To achieve this, they propose the use of a statistical technique for effective decision making in medical applications: screening and perturbing the training samples with small Gaussian Distribution Random Values (GDRV) before using the data to train the neural network. This study presents a way to improve the performance of a neural network based classification model through the proposed biased training algorithm, which has been evaluated with the Coronary Artery Disease (CAD) datasets taken from the University of California, Irvine (UCI) repository. The performance has been evaluated with standard metrics.
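The perturbation step can be sketched as adding small zero-mean Gaussian noise to each training value before training; the noise scale (sigma = 0.05) and the two-sample dataset are assumed illustrative values, not the paper's settings.

```python
# Sketch of the GDRV preprocessing idea: perturb training samples with
# small Gaussian-distributed random values before network training.
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def add_gdrv(samples, sigma=0.05):
    """Return a copy of `samples` with zero-mean Gaussian noise added."""
    return [[x + random.gauss(0.0, sigma) for x in row] for row in samples]

train = [[0.2, 0.7], [0.9, 0.1]]   # hypothetical normalised features
noisy = add_gdrv(train)
# Each value moves only slightly from the original.
print(all(abs(a - b) < 0.3 for row, nrow in zip(train, noisy)
          for a, b in zip(row, nrow)))  # True
```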
  S. Vimala , H. Khanna Nehemiah , R.S. Bhuvaneswaran , G. Saranya and A. Kannan
  The success of any organization is based on the quality of its information system, which undergoes many alterations during its life cycle; hence, it can be regarded as a living entity. A database is a core component of any information system and is affected by changes in business logic. A popularly used data model in organizations is the relational model. Changes made to a relational schema also modify the queries that access the relations, and it is difficult to identify the set of queries that access the same relation in a large information system. The similarity measure concept, one of the techniques applied for refactoring in object oriented programming, is used in the proposed system for restructuring procedures into packages, taking PL/SQL code as input. The proposed system groups the queries or procedures that access the same relations into a single package. The objective of this research is to determine whether the proposed methodology can be used as a mechanism to improve the maintainability of PL/SQL code. The packaging is done by applying game theory so as to increase the understandability and maintainability of the system.
  P. Indira Priya , D.K. Ghosh , A. Kannan and S. Ganapathy
  Genetic algorithms help to make effective decisions using suitable fitness functions and can be used to perform both clustering and classification. However, clustering algorithms enhanced only with genetic operators are not sufficient for making decisions in many critical applications. In this study, the researchers propose a new user behaviour analysis model that combines a Genetic Algorithm with a Weighted Fuzzy C-Means Clustering Algorithm (GNWFCMA) for effective clustering. The proposed clustering algorithm is used to improve classification accuracy by providing initial groups. In addition, a five-factor analysis is used for effective clustering. Finally, a neuro-fuzzy classifier is used for classifying the data. The experimental results obtained from this study show that the clustering results, when combined with the classification algorithm, provide better classification accuracy when tested with a Weblog dataset.
  E. Uma and A. Kannan
  The internet is widely used to handle sensitive data such as credit card numbers, account identifiers and PAN numbers. Many scripting attacks, such as Cross Site Scripting (XSS), parameter tampering and buffer overflow attacks, target such sensitive data. XSS attacks are executed through the user interface and can be used to run malicious code or steal personal information on the web. XSS attacks are a challenging issue for internet users because they are easily generated by an attacker but very hard to prevent with an input filter, owing to the lack of suitable techniques in existing systems. In this study, a new filtering policy is proposed for detecting and filtering such attacks. The system has been implemented with intermediary services to segregate untrusted data from trusted data in the input. The proposed XSS filter was tested with all possible attacks to verify the robustness of the filtering policy. The results show that the proposed filtering policy is very effective at refining malicious SOAP messages containing attacks such as XSS. The researchers demonstrate the implementation and accuracy of the approach through extended testing using real-world cross-site scripting exploits.
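A minimal input-filter sketch (not the paper's actual filtering policy) can illustrate the segregation step: flagging common XSS payload patterns so that untrusted input is separated from trusted input before it reaches a SOAP message.

```python
# Sketch of an input filter that flags common XSS payload patterns.
# Real filters need context-aware parsing and output encoding as well;
# this pattern list is illustrative, not exhaustive.
import re

XSS_PATTERNS = [
    re.compile(r"<\s*script", re.IGNORECASE),      # <script> tags
    re.compile(r"on\w+\s*=", re.IGNORECASE),       # inline event handlers
    re.compile(r"javascript\s*:", re.IGNORECASE),  # javascript: URLs
]

def is_untrusted(value):
    """True if the input matches any known XSS payload pattern."""
    return any(p.search(value) for p in XSS_PATTERNS)

print(is_untrusted('<script>alert(1)</script>'))     # True
print(is_untrusted('<img src=x onerror=alert(1)>'))  # True
print(is_untrusted('regular form input'))            # False
```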
  Anita Titus , H. Khanna Nehemiah and A. Kannan
  A Computer Aided Diagnosis (CAD) system for the detection of Interstitial Lung Diseases (ILDs) such as emphysema, ground glass opacity, fibrosis and micro nodules, based on texture analysis of lung Computed Tomography (CT) slices, is proposed. The texture features are extracted from the lung region using the Gray Level Histogram (GLH). The Quincunx Wavelet Transform (QWT) is applied to the lung regions and the distribution of the wavelet coefficients is modeled using a Gaussian Mixture Model (GMM) of two Gaussians with fixed mean and variable standard deviations. The standard deviations of the two Gaussians are estimated using the Expectation-Maximization (EM) algorithm. The feature vectors constructed from the texture features extracted using the GLH and the QWT are applied to a Support Vector Machine (SVM) classifier. The SVM classifier is optimized using particle swarm optimization and is used to classify the different lung tissue patterns. The classifier achieved an overall precision of 90.23%, accuracy of 96.01% and misclassification rate of 3.99%.
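The EM step the abstract describes can be sketched on synthetic data: coefficients drawn from a mixture of two zero-mean Gaussians, with EM re-estimating the mixture weight and the two standard deviations (the true sigmas of 0.5 and 3.0 are assumed for illustration, not taken from the paper).

```python
# Sketch of EM for a mixture of two zero-mean Gaussians with variable
# standard deviations, applied to synthetic "wavelet coefficients".
import math
import random

random.seed(1)
# Synthetic coefficients: a narrow component plus a broad component.
data = [random.gauss(0, 0.5) for _ in range(400)] + \
       [random.gauss(0, 3.0) for _ in range(400)]

def pdf(x, s):
    return math.exp(-x * x / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

w, s1, s2 = 0.5, 1.0, 2.0          # initial guesses
for _ in range(50):
    # E-step: responsibility of component 1 for each coefficient.
    r = [w * pdf(x, s1) / (w * pdf(x, s1) + (1 - w) * pdf(x, s2))
         for x in data]
    # M-step: re-estimate the weight and the two standard deviations
    # (means stay fixed at zero, matching the abstract's model).
    w = sum(r) / len(data)
    s1 = math.sqrt(sum(ri * x * x for ri, x in zip(r, data)) / sum(r))
    s2 = math.sqrt(sum((1 - ri) * x * x for ri, x in zip(r, data))
                   / sum(1 - ri for ri in r))

# The two estimated sigmas separate into a narrow and a broad component.
print(round(min(s1, s2), 2), round(max(s1, s2), 2))
```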
  J. Senthilkumar , D. Manjula , A. Kannan and R. Krishnamoorthy
  In this study, the researchers propose a novel automatic supervised feature selection and discretization algorithm to enhance the classification of medical images (mammograms). The proposed method consists of a new algorithm, called NANO, for filter based supervised feature selection and discretization. This algorithm solves two problems, viz., feature discretization and selection, in a single step. An important contribution of the proposed algorithm is the reduction of the irrelevant items to be mined. NANO selects the relevant features based on the average global inconsistency and average global cut point measures, speeding up the medical image diagnosis framework. Two sets of experiments have been performed to validate the proposed method: the first validates the performance of the NANO algorithm in the task of feature selection and discretization, evaluated using precision and recall metrics obtained from the query and retrieved images, and the second validates the classification accuracy. From the experiments, it is observed that the proposed method shows high sensitivity (up to 98.64%) and high accuracy (up to 96.95%).
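Supervised discretization can be sketched as choosing the cut point that best separates the class labels along a feature; plain class entropy is used here as an illustrative stand-in for NANO's inconsistency and cut-point measures, and the feature values are hypothetical.

```python
# Sketch of supervised discretization: pick the boundary ("cut point")
# that minimises the weighted class entropy of the two resulting bins.
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_cut_point(values, labels):
    pairs = sorted(zip(values, labels))
    best, best_score = None, float("inf")
    for i in range(1, len(pairs)):
        left = [l for _, l in pairs[:i]]
        right = [l for _, l in pairs[i:]]
        score = (len(left) * entropy(left)
                 + len(right) * entropy(right)) / len(pairs)
        if score < best_score:
            best_score = score
            best = (pairs[i - 1][0] + pairs[i][0]) / 2  # midpoint boundary
    return best

# Hypothetical feature with a clean class boundary around 0.55.
values = [0.1, 0.2, 0.3, 0.8, 0.9, 1.0]
labels = ["benign"] * 3 + ["malignant"] * 3
print(round(best_cut_point(values, labels), 2))  # 0.55
```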
  C. Sunil Retmin Raj , H. Khanna Nehemiah , D. Shiloah Elizabeth and A. Kannan
  Segmentation of the lung parenchyma is a challenging task in the Computer Aided Diagnosis (CAD) of lung disorders using chest Computed Tomography (CT). In this research, a two-phase supervised algorithm is proposed for the segmentation of lungs in chest CT slices. In the first phase, the initial lung region is obtained by applying a combination of iterative thresholding and morphological operations. The shape features of the resulting lung region are applied to a decision tree classifier, constructed from a training dataset, to determine whether the segmented lung forms a complete lung. In the second phase, if the initial lung is complete, the lung region is filled with lung tissue; if it is not complete, the lung region is determined by a series of operations. First, the longer of the two connected components is determined, then folded and translated horizontally. The two lung regions are then converted to a single connected component and the convex hull is obtained. The convex hull is interpolated to obtain the outer convex edge, which is superimposed on the binary image obtained by folding and translation and used as the initial contour for the Active Contour Model (ACM). The ACM algorithm is iterated until the distance between the contours of two subsequent iterations becomes less than a threshold, while ensuring that the number of components does not exceed two. The method is adaptive in that the number of ACM iterations is not fixed and depends on the image to which it is applied. This method of lung segmentation has been compared with the conventional iterative thresholding method, a convex hull based algorithm and a supervised algorithm for segmentation.
The maximum overlap achieved with all four methods is 100%, while the minimum overlap achieved is 55.3% with the proposed method, 37.83% with the conventional iterative thresholding method, 25.82% with the convex hull based algorithm and 54.25% with the supervised algorithm. Thus, the proposed two-phase supervised method is found to be better than the other three methods with which the comparison is done.
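The iterative thresholding used to obtain the initial lung region can be sketched in one dimension: the threshold is repeatedly set midway between the mean intensities of the two sides until it stabilises. The pixel intensities below are hypothetical stand-ins, not values from an actual CT slice.

```python
# Sketch of iterative (intermeans) thresholding for the initial lung
# region: dark lung-field pixels vs. bright chest-wall pixels.
def iterative_threshold(pixels, tol=0.5):
    t = sum(pixels) / len(pixels)          # start from the global mean
    while True:
        low = [p for p in pixels if p <= t]
        high = [p for p in pixels if p > t]
        # New threshold: midway between the two class means.
        new_t = (sum(low) / len(low) + sum(high) / len(high)) / 2
        if abs(new_t - t) < tol:
            return new_t
        t = new_t

# Hypothetical intensities: dark lung field (~20-35), bright wall (~200+).
pixels = [20, 25, 30, 35, 200, 210, 220, 230]
t = iterative_threshold(pixels)
print(25 < t < 200)  # True: the threshold separates the two groups
```

In the actual pipeline this binary split is followed by morphological operations before the shape features are passed to the decision tree.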
  T.V. Raja , R.T. Venkatachalapathy and A. Kannan
  A study was conducted to assess the influence of certain genetic and non-genetic factors on birth weight and to estimate the genetic and phenotypic parameters of birth weight in crossbred calves raised under organized farm conditions. The performance records of 713 calves born over a period of 11 years (1995-2005) at the Cattle Breeding Farm, Thumburmuzhy, Kerala were statistically analyzed using the mixed model least squares and maximum likelihood computer program PC-2 of Harvey. The least squares mean for the birth weight of calves was found to be 27.83±0.48 kg. The male calves (28.50 kg) were found to be heavier than the female calves (27.16 kg).
  P. Soundarapandian and A. Kannan
  In the present study, nursery culture was practiced in four different farms (A, B, C and D). The salinity ranged between 0 and 4 ppt, and an alkaline pH (7.0-8.3) was maintained by adding lime in all four ponds. The temperature fluctuations did not exceed the optimum range (26-32°C) and the dissolved oxygen level ranged between 3.5 and 6.5 ppm throughout the culture period in all four farms. Optimum transparency (31-42 cm) was maintained throughout the study period. The food conversion ratio in the present study ranged from 1:1.23 to 1:1.47. The survival rates of farms A, B, C and D were 78, 83, 80.32 and 81%, respectively. The maximum average body weight of 22.0 g in the male population and 19.70 g in the female population was reported in farm A, where the stocking pattern was 5 seeds/m2/110 days, whereas in farms B, C and D the stocking patterns were 11 seeds/m2/70 days, 15 seeds/m2/70 days and 20 seeds/m2/70 days, respectively. The average body weight of females in farms B, C and D was 4.16, 5.30 and 3.75 g, respectively. In farm A, the entire female population, with an average body weight of 19.70 g, was sold at a good price. The total production (482 kg) was also highest in farm A, whereas in farms B, C and D it was 85, 277.38 and 280.62 kg, respectively. Extension of the nursery culture up to 110 days, proper water quality and feeding management in the nurseries and low stocking density are reported to give the maximum growth and total production in farm A. This is a highly profitable business for those who maintain their own ponds. The nursery culture period in farms B, C and D was 70 days, and the male production in these farms was 7,400, 20,060 and 24,652 individuals, respectively. So, a culture period of 70 days and a semi-intensive type of culture is found to be more profitable for farmers selling male scampi seed.