Volume 1 Issue 1 by Editor IJMTER
Energy is one of the fundamental forms of resource, and it has both quantity and quality. All forms of energy are inter-convertible, and the inter-conversion between thermal and mechanical energy is studied and analyzed in the present work. Exergy is defined as the work that can be extracted from a conversion system when the system is allowed to interact with a large reservoir at some reference state; this work potential is the exergy of the system. The proposed research focuses on exergy analyses of such resource conversion systems based on technical and economic aspects, and on optimizing the various performance parameters used in these systems by taking finite heat-exchange areas and temperature differences into consideration. The objective is to understand the irreversibilities in the conversion systems in order to utilize energy effectively.
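As a point of reference, the standard closed-system exergy expression consistent with the definition above (a textbook form, not taken from the paper) is:

X = (U - U_0) + p_0 (V - V_0) - T_0 (S - S_0)

where the subscript 0 denotes the dead state fixed by the reference reservoir at temperature T_0 and pressure p_0.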
A great challenge in biomedical engineering is the non-invasive assessment of the physiological changes occurring inside the human body. Specifically, detecting abnormalities in the human eye is extremely difficult due to the various complexities associated with the process. The retina is the significant part of the human eye which can reflect abnormal changes in the eye; hence, retinal images captured by digital cameras can be used to identify the nature of the abnormalities affecting it. Retinal image analysis has gained considerable importance in the research arena due to the need for disease identification techniques. Abnormality detection using these techniques is highly complex since these diseases affect the human eye gradually. Conventional disease identification techniques based on retinal images are mostly dependent on manual intervention, and since human observation is highly prone to error, the success rate of these techniques is quite low. Because treatment planning varies for different abnormalities, the accuracy of the identification techniques must be significantly high; a lack of accuracy may lead to fatal results due to wrong treatment. Hence, there is a significant need for high-accuracy automated techniques for retinal disease identification. One of the techniques used in this paper is Random Forest, with and without sampling; with sampling the accuracy increases to 74.93%, whereas without sampling it is 52.05%.
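A minimal sketch of the comparison described above is given below, assuming a generic imbalanced dataset and simple random oversampling; the actual retinal-image features, sampling scheme and accuracy figures of the paper are not reproduced.

```python
# Hedged sketch: Random Forest accuracy with and without oversampling of the
# minority class (illustrative placeholders, not the paper's data).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.utils import resample

# Imbalanced stand-in data in place of retinal-image features.
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Without sampling.
rf_plain = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("no sampling:", accuracy_score(y_te, rf_plain.predict(X_te)))

# With simple oversampling of the minority class to the majority count.
X_min_up, y_min_up = resample(X_tr[y_tr == 1], y_tr[y_tr == 1],
                              n_samples=int((y_tr == 0).sum()), random_state=0)
X_bal = np.vstack([X_tr[y_tr == 0], X_min_up])
y_bal = np.concatenate([y_tr[y_tr == 0], y_min_up])
rf_bal = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_bal, y_bal)
print("with sampling:", accuracy_score(y_te, rf_bal.predict(X_te)))
```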
Power transformers play a crucial role in the power system: they transfer power from one voltage level to another. A power transformer breakdown may interrupt transmission and distribution operation, and under continuous operation the transformer is subjected to electrical and thermal stresses; a fault in the power transformer may lead to power supply interruption. Dissolved Gas Analysis (DGA) for detecting incipient faults has been improved using artificial neural networks and is compared with the Rogers ratio method on available samples of field data. DGA is a reliable technique for detecting the presence of incipient fault conditions in oil-immersed transformers; in this method the presence of certain key gases is monitored. The analysis methods considered are the Rogers ratio, IEC ratio, Doernenburg, Duval triangle, key gas and artificial neural network (ANN) methods. In this paper the various DGA methods are evaluated and compared. The key gases considered are hydrogen, methane, ethane, ethylene and acetylene.
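The sketch below shows the gas-ratio computation that ratio-based DGA methods such as the Rogers ratio rely on; the concentrations are hypothetical, and the diagnosis code tables, which differ between standards, are deliberately omitted.

```python
# Hedged sketch: the ratio computations underlying Rogers-ratio-style DGA.
# Gas concentrations are hypothetical (ppm); diagnosis code tables come from
# the relevant standard and are not reproduced here.
def dga_ratios(h2, ch4, c2h6, c2h4, c2h2):
    """Return the three gas ratios commonly used by ratio-based DGA methods."""
    return {
        "CH4/H2": ch4 / h2,
        "C2H2/C2H4": c2h2 / c2h4,
        "C2H4/C2H6": c2h4 / c2h6,
    }

sample = dict(h2=120.0, ch4=65.0, c2h6=30.0, c2h4=50.0, c2h2=2.0)  # example values
print(dga_ratios(**sample))
```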
Abstract - Shopping is an integral part of everyday life. Shopping lists are created to manage time and resources expeditiously while shopping and to remember important items to buy. These lists can range from grocery shopping to food, accessories, clothes, jewelry, gifts for special occasions, etc. The pen-and-paper approach is the earlier means of creating shopping lists. With the arrival of the mobile-phone era and the dominant use of mobile apps to perform everyday tasks, using apps for creating and managing shopping lists is becoming more fashionable by the day. There are hundreds of "Shopping List" apps for both Android phones and the iPhone. The design of the mobile app interface also occupies a significant share of human-computer interaction (HCI) research. The goal of this project is to study the transition from the pen-and-paper approach to the mobile app approach, examine user experience with the most popular shopping list apps for both Android and iPhone, and propose new ways to enhance this experience in terms of both app usability and user-interface improvements.
Cloud computing opens a new era as it can provide various elastic and scalable IT services in a pay-as-you-go fashion, allowing users to reduce the huge capital investment in their own IT infrastructure. Under this model, users of cloud storage services no longer physically maintain direct control over their data, which makes data security one of the main concerns of the cloud. Existing research allows data integrity to be verified without possession of the actual data file. When the verification is performed by a trusted third party, the process is called data auditing and the third party is called an auditor. Many experimental and theoretical results demonstrate that our scheme offers not only enhanced security and flexibility, but also significantly lower overhead for big-data applications with a large number of frequent small updates, such as applications in social media and business transactions.
In daily life, messengers or chat applications provide the ability for instantaneous messaging over the internet. The exchange of messages usually takes place in widely used languages like English, where both users know how to communicate in a common language; chatting on mobile phones is thus convenient only when both parties concerned know a common language. Hence we are implementing an Android-based chat application which makes cross-language communication possible using mobile networking technology and programming. This application facilitates communication between two users irrespective of the language each user wishes to use. The mode of communication available in this messenger is text. Owing to the processing power and long battery life available among current smartphones, we chose to work on the Android platform. Thus the implemented app bridges the language barrier and enables simple communication.
Image annotation has been an active research issue in recent years due to its potentially large impact on both image understanding and image retrieval. In most studies, image annotation is viewed as a multi-label learning problem, where each image may contain multiple objects and consequently be associated with a set of labels. A large amount of research has been done on image retrieval over the past two decades; however, recent research focuses on bridging the semantic gap between low-level image features and the high-level semantic tags used to express image content. In this paper, we present a comprehensive survey of the latest developments in automatic image annotation and tag ranking.
Nowadays, forensic analysis plays a crucial role in crime investigation, and it has become common for digital devices such as computers to be analyzed by investigating officers. The crimes under investigation include hacking, drug trafficking, pornography, numerous theft crimes, etc. A computer contains multiple files that must be processed when investigating a specific crime, so it is a difficult task for a forensic examiner to perform such analysis in a short period of time. Carrying out the forensic analysis of documents quickly therefore requires special techniques that make this complex task simpler; one such technique is document clustering, since clustering is the unsupervised organization of patterns (data items, observations, or feature vectors) into groups. Clustering algorithms such as K-Means, K-Medoids, Single Link, Complete Link and Average Link can simplify the discovery of new and valuable information from the documents under investigation. This survey paper studies different existing document clustering methods in the context of computer forensic analysis. We also give a comparative study of different computer forensic analysis techniques and an enhanced clustering algorithm intended to improve clustering accuracy when finding relevant documents in huge amounts of data.
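A minimal sketch of the kind of pipeline the surveyed methods build on, K-Means over TF-IDF document vectors, is shown below; the documents are invented placeholders, not case material.

```python
# Hedged sketch: K-Means clustering of text documents via TF-IDF features.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = [
    "bank transfer account password login",
    "meeting schedule project report deadline",
    "account login credentials reset password",
    "project milestone report budget review",
]
X = TfidfVectorizer(stop_words="english").fit_transform(docs)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # groups the credential-related and project-related documents
```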
Data mining is the process of extracting hidden information, useful trends and patterns from large databases, and it is employed in organizations for decision-making purposes. There are various data mining techniques such as clustering, classification, prediction, outlier analysis and association rule mining. Clustering plays a vital role in the data mining process, and this paper focuses on clustering techniques. There are many applications where clustering is employed. Clustering is the process of assigning data sets to different groups so that data sets in the same cluster exhibit similar behavior compared to data sets in other groups. This paper discusses various clustering techniques, describes their pros and cons, and also presents a comparative analysis of these techniques.
The appearance gap between sketches and photo-realistic images is a primary challenge in sketch-based image retrieval (SBIR) systems. The presence of noisy edges in photo-realistic images is a key factor in the enlargement of the appearance gap and significantly degrades retrieval performance. To cross the gap, we propose a framework consisting of a new line-segment-based descriptor called the histogram of line relationship (HLR) and a new noise-impact-reduction algorithm called object boundary selection. HLR treats sketches and the extracted edges of photo-realistic images as a series of piece-wise line segments and captures the relationship among them. Based on the HLR, the object boundary selection algorithm aims to reduce the impact of noisy edges by selecting the defining edges that best match the object boundaries. Multiple hypotheses are generated for the descriptors via hypothetical edge selection, and the selection algorithm is formulated to find the best grouping of hypotheses that maximizes the retrieval score; a fast method is also proposed. To reduce the disturbance of false matches in the scoring procedure, two constraints, on spatial and coherence aspects, are introduced. We tested the HLR descriptor and the proposed framework on public datasets and a new image dataset of three million images, which we recently collected for SBIR evaluation purposes. We compared the proposed HLR with state-of-the-art descriptors (SHoG, GF-HOG); the experimental results show that our HLR descriptor outperforms them. Combined with the object boundary selection algorithm, our framework significantly improves SBIR performance. Keywords - Large-scale sketch retrieval, line-segment-based descriptor, object boundary selection.
Nowadays, a very large number of consumer reviews for various products are available on the internet. These reviews are helpful for judging the quality of products based on their aspects. Online consumer reviews are important not only for consumers but also for firms, which use these reviews as feedback from consumers. In this paper, we review different techniques and approaches for mining consumer opinions from product reviews. Consumer reviews are disorganized, which increases the difficulty of information retrieval and knowledge acquisition. Product aspect ranking is very useful in identifying important aspects of products from consumer reviews, and it improves the usability of the reviews. The aspect identification is based on two observations: 1) the important aspects of a product are commented on heavily in the reviews by a large number of online consumers; 2) the overall opinions on the product are greatly influenced by the consumer opinions on the important aspects. It is applied to two of the most popular real-world applications, extractive review summarization and document-level sentiment classification.
Ranking fraud in the mobile App market refers to illegal or fake activities whose purpose is to bump Apps up in the popularity list. Indeed, it has become more and more common for App developers to use shady means, for example inflating their Apps' sales or posting phony App ratings, to commit ranking fraud. Many researchers have worked on how to prevent ranking fraud, but there is still very little understanding and analysis of the topic. Here we discuss, in detail, a ranking fraud detection framework for mobile Apps. In particular, it first aims to precisely locate the ranking fraud by mining the active periods, namely leading sessions, of mobile Apps; such leading sessions can be used for detecting the local anomaly rather than the global anomaly of App rankings. Furthermore, the work considers three sorts of evidence: 1) ranking-based evidence, 2) rating-based evidence and 3) review-based evidence, by modeling Apps' ranking, rating and review behaviors through statistical hypothesis tests. It also uses an optimization-based aggregation method to integrate all the evidence for fraud detection.
Personalized web search (PWS) is useful in enhancing the quality of various searches on the Internet. Sometimes, however, the user has to reveal personal details while searching for the required information, and this is the main obstacle to the wide proliferation of PWS. We study privacy protection in PWS that manages user preferences as a hierarchical user profile. We propose a PWS framework called UPS that generalizes the profile according to the privacy requirements specified by the user. Our runtime generalization uses two predictive metrics that evaluate the utility of personalization and the privacy risk of exposing the generalized profile. We use two greedy algorithms, GreedyDP and GreedyIL, for runtime generalization, and we provide an online prediction mechanism for deciding whether to generalize or personalize a query.
Energy consumption, memory and throughput of nodes are crucial factors that constrain the network lifetime of Wireless Sensor Networks (WSNs). These are challenges in optimizing communication amongst a group of spatially distributed sensor nodes in a WSN. There are several traditional approaches to overcoming these challenges, such as applying clustering techniques to effectively establish an ordered connection of sensor nodes whilst improving the overall network lifetime. In this paper an improved clustering-based multicast approach is proposed that allows any cluster head to be a multicast source with an unlimited number of subscribers, to optimize group communication in WSNs whilst ensuring sensor nodes do not deplete their energy levels rapidly. Several clustering approaches are also reviewed, and multicast versus broadcast communication in WSNs is examined.
Many methods are used for information filtering, but each previous approach has its own advantages and disadvantages. The aim of information filtering is to deliver documents of interest to the user by removing unwanted documents. Existing models, i.e. pattern-based and term-based models, suffer from polysemy and synonymy and from the noise these models generate. All these models assume that the user is interested in only one topic, but in practice users of information filtering are interested in many topics at a time. We focus on maximum matched patterns to represent topics; such patterns capture the user's interest in many topics rather than a single topic, so they can be used efficiently and effectively to represent and rank documents.
Online Social Networks are today one of the most popular mediums for communication between individuals to share information or resources. One fundamental issue in today's Online Social Networks (OSNs) is to give users the ability to control the messages posted on their own private space, to avoid exposing unwanted content. Up to now, OSNs provide little support for this requirement. To fill the gap, in this project we propose a framework allowing OSN users to have direct control over the messages posted on their walls. The aim is to create rules which can block user posts over social networks that contain obscene content or abusive language, and also to allow blocking of posted graphical images that are abusive, by applying filtering rules. The reality is that in Online Social Networks there is the possibility of posting pictures or text on public or private areas, usually called walls. Information filtering can therefore be used to give users the ability to automatically control the messages and images posted on their private walls by filtering out unwanted posts. This is accomplished through a flexible rule-based system that allows users to customize the filtering criteria to be applied to their walls, and a Machine Learning based soft classifier that automatically labels messages in support of content-based filtering.
The Internet of Things combines with Radio Frequency Identification, which enables the wireless use of electromagnetic fields to transfer data for automatically identifying and tracking tags attached to objects. Healthcare applications in general collect patient information, which can be processed, analyzed and wirelessly communicated to a healthcare gateway device, which eventually ships the information to a head-end server in the cloud. The healthcare industry presents a large number of opportunities to put these technologies to use, such as outpatient monitoring, clinical care, remote monitoring, rural healthcare or doctor-on-call. This paper gives an introduction to a healthcare IoT system, shows how the IoT system relates to health and emergency challenges, and presents an outlook on possible solutions.
The Internet of Things (IoT) is the concept that in the future all objects around us will be part of the internet. IoT covers a very wide area and includes a variety of objects such as smartphones, tablets, digital cameras and sensors. The proposed system uses a mobile phone to control home electrical devices automatically through the internet from anywhere in the world; a home automated in this way is known as a smart home. It is meant to save electric power and human effort. Using Wi-Fi technology, home electrical devices can be controlled efficiently from a distance, which reduces energy consumption. Wi-Fi helps connect the things around us so that we can access anything at any time and from any place. A mobile app is also used to control the home's electrical devices. Thus this home automation system differs from other systems by allowing the user to operate the system from anywhere at low cost, and it is expandable, allowing a variety of devices to be controlled.
assessment largely based on cell anatomy and tissue distribution. However, in many instances it is subjective and often leads to considerable variability, whereas computer diagnostic tools enable objective judgments by making use of quantitative measures. This paper presents a diagnosis system based on an adaptive neuro-fuzzy inference system (ANFIS) for effective classification of basal cell carcinoma (BCC) images from a given set of all types of skin lesions. The system is divided into three parts: image processing, feature extraction and classification. The first part deals with noise reduction and the removal of artifacts from the set of images. The second part deals with extracting a variety of features of basal cell carcinoma using the greedy feature flip algorithm (G-flip) and with the classification method using the ANFIS algorithm. The third part deals with the results, that is, the classification of BCC images among pre-cancerous-stage images (actinic keratosis) and other images (psoriasis) which look like cancer images at first glance. The results confirm that the proposed ANFIS model has potential in classifying skin cancer.
web applications continues to grow, these systems play a critical role in a multitude of companies. The way web systems impact business aspects, combined with an ever-growing internet user mass, emphasizes the importance of developing high-quality products. Thus, proper testing plays a distinctive part in ensuring reliable, robust and high-performing operation of web applications. The issues to be addressed include the security of the web application, the basic functionality of the site, its accessibility to handicapped users and fully able users, as well as readiness for the expected traffic and number of users and the ability to survive a massive spike in user traffic, the last two of which are related to load testing. The testing of web-based applications has much in common with the testing of desktop systems, such as testing of functionality, configuration and compatibility. Web application testing also involves the analysis of web-specific faults as compared to generic software faults; other faults are strictly dependent on the interaction mode imposed by the multi-tier architecture of web applications. Some web-specific faults are authentication problems, incorrect multi-language support, hyperlink problems, cross-browser portability problems, incorrect form construction, incorrect cookie values, incorrect session management, incorrect generation of error pages, etc.
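As a small illustration (our own, not part of the survey), the sketch below shows how two of the web-specific faults listed above, broken hyperlinks and incorrect error-page generation, can be checked automatically; the base URL is a placeholder and a real test harness would use a framework such as Selenium or pytest.

```python
# Hedged sketch: minimal automated checks for broken links and 404 handling.
import requests

BASE_URL = "http://example.com"  # hypothetical application under test

def check_link(path):
    """Return True if the page loads without an HTTP error status."""
    resp = requests.get(BASE_URL + path, timeout=5)
    return resp.status_code < 400

def check_error_page():
    """A request for a missing resource should yield a proper 404 page."""
    resp = requests.get(BASE_URL + "/definitely-missing-page", timeout=5)
    return resp.status_code == 404

if __name__ == "__main__":
    print("home page ok:", check_link("/"))
    print("404 handling ok:", check_error_page())
```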
of psychology which is concerned with providing a systematic account of the ways in which we can differentiate one another. Individuals differ from one another in a variety of ways: their anatomical and physiognomic characteristics; their personal appearance, grooming and manner of dress; their social backgrounds, roles and other demographic characteristics; their effect on others, or social stimulus value; and their temporary states, moods, attitudes and activities at any given moment in time, since human tendencies are largely dependent on environmental and situational consistencies. In the proposed work we study the various pieces of research that have been done to identify the traits of authors.
conditions is a difficult task. Facial expression recognition under different age variations is considered in this study. This paper emphasizes the recognition of a facial expression, the happiness mood, of nine persons using subspace methods. It mainly focuses on a new robust subspace method based on the Proposed Euclidean Distance Score Level Fusion (PEDSLF) using the PCA, ICA and SVD methods. All these methods, as well as the new robust method, are tested with the FGNET database. Automatic recognition of facial expressions is carried out. Comparative analysis shows that the PEDSLF method is more accurate for recognition of the happiness emotional facial expression.
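A minimal sketch of the kind of subspace matching these methods build on is given below: PCA projection followed by Euclidean-distance nearest-neighbour matching. The data are random placeholders rather than FGNET images, and the fusion of PCA, ICA and SVD scores is not reproduced.

```python
# Hedged sketch: PCA subspace projection with Euclidean-distance matching.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
train = rng.normal(size=(40, 1024))   # 40 flattened face images (32x32), synthetic
labels = np.repeat(np.arange(8), 5)   # 8 subjects, 5 images each
probe = train[3] + 0.05 * rng.normal(size=1024)

pca = PCA(n_components=20).fit(train)
train_proj = pca.transform(train)
probe_proj = pca.transform(probe.reshape(1, -1))

# Nearest neighbour in the subspace by Euclidean distance.
dists = np.linalg.norm(train_proj - probe_proj, axis=1)
print("predicted subject:", labels[np.argmin(dists)])
```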
optimization is taken as a part of providing security. The key goal of this paper is to consider fragmentation as a part of security. This is a survey of techniques which can be used to provide security through fragmentation.
undesirable; preventing content leakage in trusted video delivery and preventing piracy have indeed become critical. In order to prevent content leakage and piracy, a conventional system addressed the issue by proposing a method based on the observation of streamed traffic throughout the network. Piracy has also hindered the use of open peer-to-peer networks for commercial content delivery. Hence the basic idea is to propose an enhanced dynamic content-leakage detection scheme that is robust to variation in video lengths; it improves detection performance even in environments subject to such variation. To detect pirates, an identity-based signature and a time-stamped token are generated. This helps to address piracy without affecting P2P clients, so that a colluder cannot download the secured videos. The advantages lie mainly in improved content availability, low cost and copyright agreements protecting the secured videos.
analyses; patterns have to be extracted from the data to gain knowledge. In this new period, with the boom of data both ordered and unordered, it has become difficult to process, manage and analyze patterns using traditional databases and architectures. To gain knowledge about Big Data, a proper architecture should be understood. Classification is an important data mining technique with broad applications, used to classify the various kinds of data encountered in nearly every field of our life. Classification assigns an item to one of a predefined set of classes according to the item's features. This paper provides an inclusive survey of different classification algorithms and sheds light on various algorithms including J48, C4.5, the k-nearest neighbor classifier, Naive Bayes, SVM, etc.
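The sketch below compares several of the classifiers named above on a stand-in dataset with cross-validation; DecisionTreeClassifier is used as a stand-in for J48/C4.5, which are Weka implementations not available in scikit-learn.

```python
# Hedged sketch: cross-validated comparison of common classifiers.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "Naive Bayes": GaussianNB(),
    "SVM": SVC(kernel="rbf"),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```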
technique. The paper is based on how RFID sensors can sense data. Generally these data need to be sent through some wired or wireless medium, but sometimes we need to send such data as groupings or indications directly from the sensors.
authenticity are two major problems in handling digital multimedia. Watermarking is a very important field for the copyright protection of various electronic documents and media. A variety of techniques have been proposed for copyright protection of digital images, including spatial-domain and transform-domain watermarking. This paper aims to provide some basic concepts of digital image watermarking techniques and comparisons between them.
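As a minimal illustration of the spatial-domain family mentioned above, the sketch below embeds and extracts a binary watermark in the least significant bit of a synthetic grayscale image; the array sizes and values are placeholders.

```python
# Hedged sketch: least-significant-bit (LSB) watermark embedding and extraction.
import numpy as np

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)      # host image
watermark = rng.integers(0, 2, size=(8, 8), dtype=np.uint8)    # binary mark

# Embed: clear the LSB of every pixel, then write the watermark bit into it.
watermarked = (cover & 0xFE) | watermark

# Extract: the watermark is simply the LSB plane of the marked image.
extracted = watermarked & 0x01
print("watermark recovered exactly:", np.array_equal(extracted, watermark))
```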
application of data mining techniques to agriculture. Recent technologies are nowadays able to provide a lot of information on agriculture-related activities, which can then be analyzed in order to find important information. A related, but not equivalent, term is precision agriculture. This research aimed to assess the various classification techniques of data mining and apply them to a soil science database to establish whether meaningful relationships can be found. A large soil data set was extracted from the Soil Science & Agricultural Department, Bhopal, M.P., and the National Informatics Centre; the application of data mining techniques had never before been conducted on the Bhopal soil data sets. The research compares the different classifiers, and the outcome of this research could improve the management and systems of soil use across a large number of fields, including agriculture, horticulture, environmental management and land use management.
ad hoc networks, there is no centralized infrastructure to monitor or allocate the resources used by the mobile nodes. The absence of any central coordinator makes routing more complex than in cellular networks. The Ad hoc On-demand Distance Vector (AODV) routing algorithm is a routing protocol designed for ad hoc mobile devices. AODV uses an on-demand approach for finding routes. The class of routing protocols called on-demand protocols has recently attracted attention because of their low routing overhead; these protocols depend on query floods to discover routes whenever a new route is needed, and such floods take up a substantial portion of network bandwidth. Routing in mobile ad hoc networks is difficult, and a number of routing protocols such as AODV, DSR and DSDV have been implemented. In this paper, an attempt has been made to thoroughly study the AODV variants, and a new variant called AR-AODV is proposed.
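The sketch below simulates the query-flood route discovery that on-demand protocols such as AODV rely on, as a breadth-first flood of route requests (RREQ) over a hypothetical static topology; sequence numbers, RREP unicasting and route maintenance, which real AODV adds, are omitted.

```python
# Hedged sketch: query-flood (RREQ) route discovery over a toy topology.
from collections import deque

topology = {            # hypothetical ad hoc neighbourhood relations
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}

def discover_route(source, destination):
    """Flood RREQs outward from source; return the first path reaching destination."""
    queue = deque([[source]])
    visited = {source}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == destination:
            return path                      # the route carried back by the RREP
        for neighbour in topology[node]:
            if neighbour not in visited:     # each node rebroadcasts a RREQ only once
                visited.add(neighbour)
                queue.append(path + [neighbour])
    return None

print(discover_route("A", "E"))  # e.g. ['A', 'B', 'D', 'E']
```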
management system. This element is responsible for translating a user-submitted query, commonly written in a non-procedural language, into an efficient query evaluation program that can be executed against the database. This research paper describes the architecture and steps of query processing and the optimization of time and memory usage. The key goal of this paper is to explain the basic query optimization process and its architecture.
many computer graphics/vision applications as it reduces the visibility of the scene. Airlight and attenuation are the two basic phenomena of haze: airlight enhances the whiteness in the scene, while attenuation reduces the contrast. The colour and contrast of the scene are recovered by haze removal techniques, which are applied in many applications such as object detection, surveillance and consumer electronics. This paper focuses on methods for effectively eliminating haze from digital images and also indicates the demerits of current techniques.
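For reference, a standard formulation that most haze-removal methods work with (assumed here, not quoted from the paper) expresses the observed image as an attenuated scene radiance plus an airlight term:

I(x) = J(x) \, t(x) + A \, (1 - t(x)), \qquad t(x) = e^{-\beta d(x)}

where I is the hazy image, J the haze-free scene radiance, A the global airlight, t the transmission governing attenuation, \beta the scattering coefficient and d the scene depth.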
the late 1980s. Object DBMSs add database functionality to object programming languages. A major benefit of this approach is the unification of application and database development into a seamless data model and language environment. As a result, applications require less code and use more natural data modeling,
nanoelectronics. However, it is very difficult to obtain fibers of conducting polymers like polyaniline (PANI) and polypyrrole; hence they are invariably mixed with other insulating polymers such as polymethylmethacrylate (PMMA) to obtain a conducting composite, which depends on the percolation of the conducting polymer. Here, we report the preparation of PANI-PMMA composite films by a chemical deposition method. The polymer fibers are investigated at room temperature with different concentrations of PANI (0.05 M, 0.1 M, 0.2 M, 0.5 M). It is observed that there is a significant enhancement in the conductivity of these fibers with an increase in the concentration of PANI. The D.C. conductivity, SEM, FTIR and gas-detecting properties of the films are studied.
the globe, particularly in the current scenario among users with low-end mobile devices, and web exploration is one of the main and most valuable of these applications. SMS-supported web exploration takes unstructured queries and returns web snippets via SMS. This application permits users with low-priced mobile phones and no data plan to tap into the enormous amount of information on the web. SMS-supported web exploration is a demanding problem since the channel is both tremendously constricted and costly: preferably, both the query and the response fit within one SMS. This paper presents the blueprint and execution of SMS Search, an SMS-supported search application that enables users to obtain exceptionally crisp (one SMS message) search responses for queries across random keywords in one round of communication. SMS Search is intended to complement existing SMS-supported search applications that are either inadequate in the keywords they identify or engage an individual in the loop.
communication over the Internet Protocol. SIP is, however, designed with an open structure that is vulnerable to security attacks. The SIP flooding attack is the most severe attack because it is easy to launch and capable of quickly draining the resources of both network and node. The existing flooding detection schemes are either anomaly based or misuse based. An anomaly-based scheme can detect unknown attacks and does not need prior knowledge of the attack, but it generates some false alarms, suffers from accuracy problems and gives false positives. Misuse-based schemes, in contrast, have high detection accuracy and no false positives but cannot detect unknown attacks. To overcome the problems of both detection schemes, a hybrid detection scheme is proposed. The proposed hybrid scheme combines features of both the anomaly-based and misuse-based schemes, and it gives a fast response, increases detection accuracy and produces no false alarms.
impacts. This paper proposes an asymmetrical thirteen-level H-bridge inverter circuit. Two inputs from solar PV panels are given to the converter, and maximum power is extracted by using a maximum power point tracking method. The integrated converter is a DC-to-DC boost converter. Its output is given to the H-bridge inverter, which converts DC to AC, and the thirteen-level output voltage is applied to the load. Operational analysis and simulation results are given for the proposed circuit.
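As an illustration of maximum power point tracking, the sketch below runs a perturb-and-observe loop, one common MPPT method (the abstract does not say which method is used); the PV power curve is a toy placeholder.

```python
# Hedged sketch: perturb-and-observe (P&O) maximum power point tracking.
def panel_power(v):
    """Toy PV power curve with its maximum at 17 V (illustrative only)."""
    return max(0.0, 60.0 - 0.5 * (v - 17.0) ** 2)

def perturb_and_observe(v, p_prev, step=0.2, direction=+1):
    """Return the next operating voltage, perturbation direction and measured power."""
    p = panel_power(v)
    if p < p_prev:          # power dropped: reverse the perturbation direction
        direction = -direction
    return v + direction * step, direction, p

v, d, p = 12.0, +1, 0.0
for _ in range(200):
    v, d, p = perturb_and_observe(v, p, direction=d)
print(f"operating point settles near {v:.1f} V, {p:.1f} W")
```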
user-generated content in Hindi on the internet is the motivation behind sentiment analysis research, which is growing at lightning speed. This information can prove to be very useful for researchers, governments and organizations to learn what is on the public's mind and to make sound decisions. Opinion mining, or sentiment analysis, is a natural language processing task that mines information from various text forms such as reviews, news and blogs and classifies them on the basis of their polarity as positive, negative or neutral. Over the last few years, an enormous increase has been seen in Hindi-language content on the Web. Research in opinion mining has mostly been carried out for the English language, but it is very important to perform opinion mining in Hindi as well, since a large amount of information in Hindi is also available on the Web. This paper gives an overview of the work that has been done for the Hindi language.
stronger privacy protection. A number of schemes have been proposed to protect privacy in ad hoc networks. However, none of these schemes offers the unobservability property, since data packets and control packets are still linkable and distinguishable in these schemes. In this paper, we define stronger privacy requirements regarding privacy-preserving routing in mobile ad hoc networks. We then propose an Unobservable Secure Routing scheme (USOR) to offer complete unlinkability and content unobservability for all types of packets. USOR is efficient as it uses a novel combination of group signatures and ID-based encryption for route discovery. Security analysis demonstrates that USOR protects user privacy well against both inside and outside attackers. We implement USOR on the NS2 network simulator and evaluate its performance by comparing it with Ad hoc On-demand Distance Vector routing (AODV) and MASK. The simulation results show that USOR not only has satisfactory performance compared to AODV, but also achieves stronger privacy protection than existing schemes like MASK.
basic energy source for mankind, and it is also one of the largest energy sources available. PV systems are a relatively new technology. The operational experience with PV systems is at an acceptably high level, and today's installed PV systems are of good quality and able to operate without problems for many years. The price level of PV modules and the balance-of-system costs (inverter included) have decreased significantly. This energy is available all around the world in large quantity. When this energy is collected by solar PV cells, it comes as low power with a D.C. supply, which is not compatible with the existing grid anywhere in the world; an inverter and a converter stage are therefore required before this energy can be used. Grid-interactive PV systems can vary substantially in size.
transactions, and thus providing additional security measures is necessary. According to consumers' expectations, these security measures must be cheap, reliable and unintrusive to the authorized person. A technique which meets these requirements is handwritten signature verification. Signature verification has an advantage over other biometric techniques, including voice, iris, fingerprint and palm recognition, as it is used in daily routine procedures like banking operations, document analysis, electronic funds transfer, access control and many others. Most importantly, it is easy to use and people are less likely to object to it. The proposed technique involves a new approach that depends on a neural network, which enables the user to recognize whether a signature is original or a fraud. Scanned images are introduced into the computer, their quality is improved with the help of image enhancement and noise reduction techniques, specific features are extracted and a neural network is trained. The different stages of the process are image pre-processing, feature extraction and pattern recognition through neural networks.
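A minimal sketch of the three-stage pipeline described above (pre-processing, feature extraction, neural-network classification) is given below on synthetic stand-in data; the feature set, network architecture and the notion of a "forged" sample are all placeholders rather than the paper's method.

```python
# Hedged sketch: feature extraction + neural-network signature verification.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def extract_features(image):
    """Placeholder features: aspect ratio, ink density, centroid coordinates."""
    ys, xs = np.nonzero(image)
    h, w = image.shape
    return np.array([w / h, image.mean(), xs.mean() / w, ys.mean() / h])

def fake_signature(shift):
    """Synthetic binary 40x120 'signature' image; shift crudely mimics forgery."""
    img = np.zeros((40, 120))
    img[10 + shift:25 + shift, 20:100] = rng.random((15, 80)) > 0.6
    return img

images = [fake_signature(0) for _ in range(30)] + [fake_signature(8) for _ in range(30)]
labels = [1] * 30 + [0] * 30                      # 1 = genuine, 0 = forgery
X = np.array([extract_features(img) for img in images])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
print("verification accuracy on held-out samples:", clf.score(X_te, y_te))
```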
Power consumption can be reduced by substituting some flip-flops with fewer multi-bit flip-flops. Multi-bit flip-flops are one of the strategies for reducing clock power consumption. This project concentrates on reducing clock power by using multi-bit flip-flops with clock synchronization: the clock power consumption is diminished when two single-bit flip-flops are synchronized with a single clock pulse. Merging single-bit flip-flops into one multi-bit flip-flop avoids duplicate inverters and lowers the aggregate clock power consumption, which also lessens the total area. A combination table is built to obtain a multi-bit flip-flop that records which flip-flops can be merged. This work concentrates on the D flip-flop, which increases the loading of the clock signal. A QCL adder is used as an application for the multi-bit flip-flop, and a highest-'1'-bit finding algorithm is used to discover the highest 1 bit in the output of the QCL adder; this algorithm checks the output of the QCL adder in each cycle.
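A small software model of the highest-'1'-bit search applied to an adder output word is sketched below; in hardware this would be a priority encoder, and the 8-bit word width is an assumption for illustration.

```python
# Hedged sketch: find the index of the most significant set bit of a word.
def highest_one_bit(value, width=8):
    """Return the index of the most significant set bit, or -1 if none."""
    for i in range(width - 1, -1, -1):
        if value & (1 << i):
            return i
    return -1

adder_output = 0b00101100             # hypothetical QCL adder result
print(highest_one_bit(adder_output))  # -> 5
```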
scene and keeping track of its motion, position and occlusion. There are three steps in a video object tracking system: object detection, object classification and object tracking. Object detection is performed to check the existence of objects in the video. The detected objects can then be classified into various categories on the basis of their shape, motion, color and texture. Object tracking is performed by monitoring object changes. In this paper we give an overview of different object detection, object classification and object tracking techniques, together with a comparison of the different techniques used for the various stages of tracking.
association research. In an object-oriented database, the access scope of a query against a class in general includes not only the class itself but also all subclasses of the class. This means that to support the evaluation of a query, the system must maintain one index on an attribute for each class involved in the query.
misoperation of customer or end-user equipment. A new method, a dynamic voltage restorer (DVR) with SMES, is proposed to protect consumer loads from tripping. The DVR can effectively inject voltage into the power lines. To improve the performance of the DVR, a superconducting magnet is selected as the energy storage unit; voltage sag and swell are compensated by short periods of voltage injection. The SMES-based DVR is used to improve the performance of the power system: it has a high power rating with higher efficiency than other energy storage devices, and it restores the line voltage to its nominal value within a few milliseconds. Most power quality problems are voltage sag, swell, interruption, transients and fluctuation; among these, voltage sag is the most severe, so it is analyzed and mitigated using a custom power device and SMES system in MATLAB SIMULINK in the proposed system.
in images that are already associated. This paper extracts high-level texture features in order to build an efficient image-differencing method for the analysis of brain tumors; it produces sets of texture features that are spatial. We demonstrate a technique that avoids arbitrary spatial constraints and is robust in the presence of noise, outliers and imaging artifacts, while outperforming even commercial products in the analysis of brain tumor images. First, the landmarks are established and the top candidates are sorted into a final set. Second, the top sets of the two descriptions are differenced through a cluster judgment. The symmetry of the human body is utilized to increase the accuracy of the finding. We imitate this technique in an effort to understand, and ultimately capture, the judgment of the radiologist. The image differencing with clustered contrast process determines the presence of a brain tumor. Using the most favorable features extracted from normal and tumor regions of MRI by means of statistical features, classifiers are used to categorize and segment the tumor portion in abnormal images. Both the testing and training phases give the percentage of accuracy for each parameter in the neural networks, which gives an idea of the best one to be used in further work. The results show the outperformance of the algorithm in terms of classification accuracy, which makes it a promising tool for classification, and the work requires extension to broader brain tumor analysis.
deregulated power systems is presented. Restructuring and deregulation have exposed transmission planners to new objectives and uncertainties; therefore, new criteria and approaches are needed for transmission planning in deregulated environments. In this paper we introduce a new method for computing Locational Marginal Prices and new market-based criteria for transmission expansion planning in deregulated environments. The presented approach is applied to the Southern Region (SR) 48-bus Indian system using a scenario technique with the expected cost criterion.
output of 0.85-0.5 V with 90-nm CMOS technology. A current-splitting technique is used to boost the gain of an error amplifier. A power-noise cancellation mechanism is formed in the rail-to-rail output stage of the error amplifier to minimize the size of the power MOS transistor. In this paper we achieve a fast transient response, high power supply rejection, low dropout, low voltage and small area. CMOS processes have been used in large-scale integrated circuits such as LSI chips and microprocessors, and these have been miniaturized constantly. Taking full advantage of this miniaturization technology, CMOS linear regulators have become power management ICs that are widely used in portable electronic products to realize low profile, low dropout and low supply current.
information. This involves frequently updating its location information within its neighboring region, which is called a neighborhood update (NU), and updating its location information at certain distributed location servers in the network, which is called a location server update (LSU). The operation costs of location updates and the performance losses of the target application due to location errors pose the question of how a node should decide the optimal strategy for updating its location information, where optimality means minimizing the overall costs. The location update decision problem is modeled as a Markov Decision Process (MDP). We examine the monotonicity properties of optimal NU and LSU operations with respect to the location-related costs under a general cost setting. A separable cost structure shows that the location update decisions of NU and LSU can be carried out independently without loss of optimality, which is a separation property. From the separation property of the problem structure and the monotonicity properties of optimal actions, we find that 1) there always exists a simple optimal update rule for LSU operations and 2) for NU operations. If no prior knowledge of the MDP model is available
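To make the MDP formulation above concrete, the sketch below runs value iteration on a toy two-state model of "accurate" versus "stale" location information; the states, transition probabilities and costs are our own placeholders, not the paper's model, and the model-free learning case is not covered.

```python
# Hedged sketch: value iteration for a toy location-update MDP.
states = ["accurate", "stale"]          # location info still accurate vs. stale
actions = ["skip", "update"]            # skip the update or perform one
cost = {"skip": 0.0, "update": 1.0}     # update cost vs. free skipping
app_loss = {"accurate": 0.0, "stale": 2.0}   # application loss per step

# Transition probabilities P[s][a] -> distribution over next states.
P = {
    "accurate": {"skip": {"accurate": 0.7, "stale": 0.3},
                 "update": {"accurate": 1.0, "stale": 0.0}},
    "stale":    {"skip": {"accurate": 0.0, "stale": 1.0},
                 "update": {"accurate": 1.0, "stale": 0.0}},
}

gamma, V = 0.9, {s: 0.0 for s in states}
for _ in range(200):                     # synchronous value iteration
    V = {s: min(cost[a] + app_loss[s] +
                gamma * sum(pr * V[s2] for s2, pr in P[s][a].items())
                for a in actions)
         for s in states}

policy = {s: min(actions, key=lambda a: cost[a] + app_loss[s] +
                 gamma * sum(pr * V[s2] for s2, pr in P[s][a].items()))
          for s in states}
print(V, policy)   # e.g. update when stale, skip while accurate
```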
through a porous slab bounded between two fixed permeable parallel plates is examined. There is constant injection at one plate and equal suction at the other. The two plates are kept at two different temperatures and the flow is generated by a constant pressure gradient. The solutions of the governing equations of the flow with appropriate boundary conditions have been obtained analytically.
choosing the best database plan. Unlike preceding query optimization techniques that use only a single approach for identifying the best query plan by extracting data from the database, our approach takes into account various phases of query processing and optimization, heuristic estimation techniques and a cost function for identifying the best execution plan. A review of the various phases of query processing, the goals of the optimizer, the various rules for heuristic optimization and the cost components involved is presented in this paper.
functionality of the four-port router for network-on-chip using the latest verification methodologies, hardware verification languages and EDA tools, and to qualify the IP for synthesis and implementation. This router design contains three output ports and one input port and uses a packet-based protocol; the design consists of registers and a FIFO. For larger networks, where a direct-mapped approach is not feasible due to FPGA resource limitations, a virtualized time-multiplexed approach was used. Compared to the provided software reference implementation, our direct-mapped approach achieves three orders of magnitude speedup, while our virtualized time-multiplexed approach achieves one to two orders of magnitude speedup, depending on the network and router configuration.
duplicate IP addresses to hack confidential data. Hop-by-hop authentication is necessary for secure communication to prevent such breaches of confidentiality. Multi-hop routing in wireless sensor networks (WSNs) offers little protection against identity deception through replayed routing information. This defect gives an adversary the chance to misdirect significant network traffic, resulting in disastrous consequences through attacks against the routing protocols, including sinkhole, wormhole and Sybil attacks. The situation is further aggravated by mobile and harsh network conditions. The problem cannot be solved by traditional encryption or authentication techniques, and efforts at developing trust-aware routing protocols do not effectively address it. To secure WSNs against adversaries misdirecting the multi-hop routing, we propose a method, "Trusted Routing Path Selection in WSNs through TARF", a robust trust-aware routing framework for dynamic WSNs. Without tight time synchronization or known geographic information, TARF provides trustworthy, secure, time-efficient and energy-efficient routes. Most importantly, TARF proves effective against those harmful attacks developed out of identity deception; the resilience of TARF is verified through extensive evaluation with both simulation and empirical experiments on large-scale WSNs under various scenarios.
data management systems from local sites to the commercial public cloud for greater flexibility and economic savings. However, to protect data privacy, sensitive data have to be encrypted before outsourcing. Considering the large number of data users and documents in the cloud, it is crucial for the search service to allow multi-keyword queries and to provide result similarity ranking to meet the effective data retrieval need. Related works on searchable encryption focus on single-keyword search or Boolean keyword search and rarely differentiate the search results. We first propose a basic MRSE scheme using secure inner product computation, and then significantly improve it to meet different privacy requirements in two levels of threat models.

The Incremental High Utility Pattern Transaction Frequency Tree (IHUPTF-Tree) is designed according to the transaction frequency (in descending order) of items to obtain a compact tree. By using high utility patterns the items can be arranged in an efficient manner. A tree structure is used to sort the items; thus the items are sorted and the frequent patterns are obtained. The frequent pattern items are retrieved from the database by using a hybrid tree (H-Tree) structure, so the execution time becomes faster. Finally, the frequent pattern items that satisfy the threshold value are displayed.
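As a plaintext illustration of the ranking idea behind the MRSE scheme mentioned above, the sketch below computes a coordinate-matching score as the inner product of keyword-indicator vectors; the dictionary and documents are invented, and the secure (encrypted) inner product is not shown.

```python
# Hedged sketch: plaintext coordinate-matching score for multi-keyword ranking.
import numpy as np

dictionary = ["cloud", "privacy", "ranking", "storage", "audit"]
documents = {
    "doc1": {"cloud", "privacy", "storage"},
    "doc2": {"ranking", "audit"},
    "doc3": {"cloud", "ranking", "privacy"},
}
query = {"cloud", "privacy"}

def indicator(words):
    """0/1 vector marking which dictionary keywords appear in the word set."""
    return np.array([1 if w in words else 0 for w in dictionary])

q = indicator(query)
scores = {name: int(indicator(words) @ q) for name, words in documents.items()}
print(sorted(scores.items(), key=lambda kv: -kv[1]))  # ranked result list
```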
of decryption rights for any set of ciphertexts. One can aggregate any set of secret keys and make them as compact as a single key. The secret-key holder can release a constant-size aggregate key for flexible choices of ciphertext sets in cloud storage. In KAC, users encrypt a message not only under a public key but also under an identifier of the ciphertext called its class; that is, the ciphertexts are further categorized into different classes. The key owner holds a master secret called the master-secret key, which can be used to extract secret keys for different classes. More importantly, the extracted key can be an aggregate key which is as compact as a secret key for a single class, yet aggregates the power of many such keys, i.e., the decryption power for any subset of ciphertext classes. The key-aggregate cryptosystem is enhanced with boundary-less ciphertext classes. The system is improved with a device-independent key distribution mechanism. The key distribution process is enhanced with security features to protect against key leakage, and the key parameter transmission process is integrated with the ciphertext download process.
network. In a mobile ad-hoc network, the sending and receiving of packets is the most important operation. When packets sent from one node toward another are dropped before they reach the destination, security is not maintained. We consider whether packets are dropped by a malicious node and how such packet drops can be detected. This paper surveys various security techniques against the packet dropping attack.
Using traditional databases and architectures, it has become difficult to process, manage and analyze patterns. To gain knowledge about Big Data, a proper architecture should be understood. Classification is an important data mining technique with broad applications, used to classify the various kinds of data encountered in nearly every field of our life. Classification assigns an item to one of a predefined set of classes according to the item's features. This paper sheds light on various classification algorithms, including J48, C4.5 and Naive Bayes, using a large dataset.
phrases, or the definitions of words. This paper describes the many challenges inherent in building a reverse dictionary and maps the problem to the well-known concept similarity problem. The mainstream web search engines are basic versions of such a system; they take advantage of their huge scale, which permits inferring general interest in documents from link information. This paper describes a basic study of a database-driven reverse dictionary using three large-scale datasets, namely person names, general English words and biomedical concepts, and analyzes the difficulties arising in the use of documents produced by the reverse dictionary.
medical images, segmentation is done for studying the anatomical structure, identifying a region of interest (ROI) such as a tumor or other abnormality, identifying an increase in tissue volume in a region, and treatment planning. Currently there are many different algorithms available for image segmentation. This paper lists and compares some of them; each has its own advantages and limitations.
potential to revolutionize many segments. A Wireless Sensor Network (WSN) is made up of a collection of sensor nodes, which are small, energy-constrained devices. Routing is one of the research areas in wireless sensor networks, so designing an efficient routing protocol that reduces energy consumption is an important factor. In this paper, a brief introduction to the routing challenges in WSNs is given. The paper also provides a basic classification of routing protocols in WSNs, presents the most energy-efficient protocol, LEACH, along with its advantages and disadvantages, and focuses on some of the improved versions of the LEACH protocol.
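As a concrete illustration of the randomized rotation that makes LEACH energy efficient, the sketch below models the standard cluster-head election threshold; the desired head fraction P and the node count are assumptions for illustration, not values from the paper.

```python
# Hedged sketch: LEACH's randomized cluster-head election. A node that has not
# served as cluster head in the current epoch elects itself when a uniform
# random draw falls below the threshold T(n).
import random

P = 0.05  # desired fraction of cluster heads per round

def leach_threshold(round_number, was_head_this_epoch):
    """T(n) = P / (1 - P * (r mod 1/P)) for eligible nodes, else 0."""
    if was_head_this_epoch:
        return 0.0
    return P / (1 - P * (round_number % round(1 / P)))

def elects_itself(round_number, was_head_this_epoch):
    return random.random() < leach_threshold(round_number, was_head_this_epoch)

random.seed(0)
heads = [n for n in range(100) if elects_itself(round_number=3, was_head_this_epoch=False)]
print("cluster heads this round:", heads)
```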
motors involved in the process. Rolling of cloth should be synchronized with the speed of the weaving spindle to avoid damage, and motor speed synchronization is vital in conveyor belts driven by multiple motors. Abrupt load variations may cause hunting or oscillatory behavior in d.c. machines, and this behavior can be detrimental to the process. Digitally controlled d.c. machines can exhibit a much aggravated form of this phenomenon owing to poor sampling-period selection. Traditionally, processes are synchronized through a mechanical transmission system consisting of a line shaft, gears, pulleys, etc. Among the available software mechanisms, master-slave synchronization is a widely used technique.
Multi-motor applications have become a very attractive field in industrial applications, replacing traditional mechanical coupling. Many textile applications involve synchronized-speed motors; for example, the wrapping of cloth should be synchronized with the speed of the weaving spindle to avoid damage, and similarly, in some cases the speed of a long conveyor belt driven by multiple motors needs to be kept constant. In such applications the master-slave technique is used as a software mechanism to synchronize the speeds of different motors and avoid damage.
households, eateries, restaurants, hotels, clubs and industrial applications. This paper presents an automatic system that improves the production rate, increases product quality and addresses labor problems in the plant. The proposed model is designed to control and monitor the jaggery manufacturing process. The main controlling console of the automatic jaggery production system is an Atmel microcontroller, which reads LM35 sensors that measure the temperature of the boiling sugarcane juice, so that the temperature can be monitored to obtain good-quality molten jaggery. The movement of the pan is controlled by a DC motor and switches programmed through the microcontroller, along with the opening of a control valve that lets the molten jaggery (pug) flow easily into the cooling pit, giving the jaggery its required shape and reducing manual labor.
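A minimal sketch of such a monitoring loop is given below in desktop Python rather than Atmel firmware; the set-point and the stand-in actuator callbacks are hypothetical, and only the LM35 scale factor of 10 mV per degree Celsius is a device fact:

LM35_MV_PER_DEG_C = 10.0
SETPOINT_C = 118.0          # assumed target for molten jaggery (illustrative only)

def lm35_to_celsius(adc_millivolts):
    return adc_millivolts / LM35_MV_PER_DEG_C

def control_step(adc_millivolts, tilt_pan, open_valve):
    """One pass of the control loop: tilt the pan and open the valve once the
    boiling juice reaches the set-point temperature."""
    temp_c = lm35_to_celsius(adc_millivolts)
    if temp_c >= SETPOINT_C:
        tilt_pan()        # DC motor moves the pan toward the cooling pit
        open_valve()      # control valve releases the molten jaggery (pug)
    return temp_c

# Example with stand-in actuators:
print(control_step(1195.0, lambda: print("pan tilted"), lambda: print("valve open")))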
power, multifunctional sensor nodes for wireless communication. Sensing applications have become common and a practical reality; these include environmental monitoring, intrusion detection, battlefield surveillance, and so on. In a wireless sensor network (WSN), how to use the restricted power resources of the sensors so as to prolong the network lifetime for as long as possible, while still performing the sensing and sensed-data reporting tasks, is the most important issue in network design. In a WSN, sensor nodes deliver sensed data back to the sink by multi-hopping. The sensor nodes close to the sink usually consume more battery power than the others; consequently, these nodes drain their battery energy quickly and shorten the network lifetime of the WSN. Sink relocation is an efficient network-lifetime extension method that avoids draining too much battery energy from a particular group of sensor nodes. In this paper, we propose a moving strategy called energy-aware sink relocation (EASR) for mobile sinks in WSNs. The proposed mechanism uses information related to the residual battery energy of the sensor nodes to adaptively adjust their transmission range and the relocating scheme of the sink. Theoretical and numerical analyses are given to show that the EASR method can extend the network lifetime of the WSN significantly.
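The sketch below illustrates the general energy-aware relocation idea in simplified form (it is not the exact EASR scheme of the paper): the mobile sink periodically moves one step toward the candidate position whose nearby nodes hold the most residual energy; the coordinates, radio range and energy values are made up for the example.

import math

def residual_energy_around(pos, nodes, radio_range):
    """Sum of residual energy of nodes within radio_range of a candidate sink position."""
    return sum(e for (x, y), e in nodes.items()
               if math.hypot(x - pos[0], y - pos[1]) <= radio_range)

def relocate_sink(sink, nodes, radio_range=25.0, step=5.0):
    candidates = [sink,
                  (sink[0] + step, sink[1]), (sink[0] - step, sink[1]),
                  (sink[0], sink[1] + step), (sink[0], sink[1] - step)]
    return max(candidates, key=lambda c: residual_energy_around(c, nodes, radio_range))

# The sink moves one step toward the western nodes, which still hold most residual energy.
nodes = {(10, 50): 0.9, (20, 50): 0.8, (70, 50): 0.1, (75, 50): 0.05}
print(relocate_sink(sink=(40.0, 50.0), nodes=nodes))   # -> (35.0, 50.0)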
for useful information on the web. However, users might experience failure when search engines return
irrelevant results that do not meet their real intentions. Such irrelevance is largely due to the enormous
variety of users’ contexts and backgrounds, as well as the ambiguity of texts. Personalized web search
(PWS) is a general category of search techniques aiming at providing better search results, which are
tailored for individual user needs. As the price of doing so, user information has to be collected and analyzed to figure out the user intention behind the issued query. The solutions to PWS can generally be categorized into two types, namely click-log-based methods and profile-based ones. The click-log-based methods are straightforward: they simply impose a bias toward clicked pages in the user's query history. Although this
strategy has been demonstrated to perform consistently and considerably well, it can only work on repeated
queries from the same user, which is a strong limitation confining its applicability. In contrast, profile-based
methods improve the search experience with complicated user-interest models generated from user profiling
techniques. Profile-based methods can be potentially effective for almost all sorts of queries, but are
reported to be unstable under some circumstances.
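As an illustration of the first category, the following is a minimal click-log-based re-ranking sketch in Python; the boost weight and data structures are illustrative assumptions:

from collections import defaultdict

class ClickLogReRanker:
    def __init__(self, boost=0.5):
        self.boost = boost
        self.clicks = defaultdict(set)          # (user, query) -> set of clicked URLs

    def record_click(self, user, query, url):
        self.clicks[(user, query)].add(url)

    def rerank(self, user, query, results):
        """results: list of (url, base_score); returns the list re-sorted with click bias."""
        clicked = self.clicks[(user, query)]
        biased = [(url, score + (self.boost if url in clicked else 0.0))
                  for url, score in results]
        return sorted(biased, key=lambda r: r[1], reverse=True)

r = ClickLogReRanker()
r.record_click("alice", "jaguar", "wikipedia.org/wiki/Jaguar")      # the animal, not the car
print(r.rerank("alice", "jaguar",
               [("jaguar.com", 0.9), ("wikipedia.org/wiki/Jaguar", 0.7)]))

Note that, exactly as stated above, this bias only helps when the same user repeats the same query.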
confidentiality and integrity methods. A privacy model hides individual identities in public data values, and sensitive attributes are protected using anonymity methods. In a distributed environment, two or more parties have their own private data and can collaborate to compute any function on the union of their data. Secure Multiparty Computation (SMC) protocols are used for privacy-preserving data mining in distributed environments. Association rule mining techniques are used to fetch frequent patterns, and the Apriori algorithm is used to mine association rules in databases. Homogeneous databases share the same schema but hold information on different entities; a horizontal partition refers to a collection of homogeneous databases maintained by different parties. The Fast Distributed Mining (FDM) algorithm is an unsecured distributed version of the Apriori algorithm, and the Kantarcioglu and Clifton protocol is used for secure mining of association rules in horizontally distributed databases. The Unifying lists of locally Frequent Itemsets Kantarcioglu and Clifton (UniFI-KC) protocol is used for rule mining in a partitioned database environment and is enhanced in two ways to improve security: a secure threshold-function computation algorithm is used to compute the union of the private subsets held by the interacting players, and a set-inclusion computation algorithm is used to test whether an element held by one player belongs to a subset held by another. The system is further improved to support secure rule mining in a vertically partitioned database environment, and the subgroup discovery process is adapted for the partitioned setting. The system can also be extended to support generalized association rule mining and is enhanced to control security leakages in the rule mining process.
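For reference, a minimal, non-distributed Apriori sketch (the building block underlying FDM and UniFI-KC) is shown below; the toy transactions and support threshold are illustrative:

from itertools import combinations

def apriori(transactions, min_support=2):
    """Return all itemsets whose support count is at least min_support."""
    transactions = [set(t) for t in transactions]
    items = {frozenset([i]) for t in transactions for i in t}
    frequent, k_sets = {}, items
    while k_sets:
        # Count support of the current candidates.
        counts = {c: sum(1 for t in transactions if c <= t) for c in k_sets}
        survivors = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(survivors)
        # Generate (k+1)-item candidates by joining surviving k-itemsets.
        keys = list(survivors)
        k_sets = {a | b for a, b in combinations(keys, 2) if len(a | b) == len(a) + 1}
    return frequent

txns = [["bread", "milk"], ["bread", "butter"], ["bread", "milk", "butter"], ["milk"]]
print(apriori(txns, min_support=2))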
systems and biomedical applications. By using the CORDIC algorithm, the number of iterations needed to process an image in the system can be reduced. Low-power design is a challenging task during system operation, and previous approaches sought to minimize power consumption without considering image quality. In this paper, the iterations of a CORDIC-based low-power DCT are balanced against image quality. A hardware implementation of the ROM and control logic circuitry would require a large hardware area in such a system, so a look-ahead CORDIC approach is used to complete the iterations in one pass, reducing hardware area and the number of iterations and thereby maximizing battery lifetime. This idea is used to achieve a low-power design for image and video compression applications.
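The following is a minimal rotation-mode CORDIC sketch in Python, showing the primitive behind CORDIC-based DCT hardware; the iteration count is the knob that trades accuracy against work, which is what a low-power design can exploit. The angle values are illustrative:

import math

def cordic_cos_sin(angle, iterations=16):
    # Pre-compute the elementary angles atan(2^-i) and the gain correction K.
    atans = [math.atan(2.0 ** -i) for i in range(iterations)]
    k = 1.0
    for i in range(iterations):
        k *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))

    x, y, z = 1.0, 0.0, angle
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0                 # rotate toward zero residual angle
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * atans[i]
    return x * k, y * k                              # (cos(angle), sin(angle))

print(cordic_cos_sin(math.pi / 6))                   # approx (0.8660, 0.5000)
print(cordic_cos_sin(math.pi / 6, iterations=6))     # fewer iterations, coarser result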
and more challenging dimension. Quality has evolved and been transformed from the inspection era to the quality control regime, then to quality management, and finally to the present TQM approach. At every stage of this transformation, "Quality" has taken on a wider dimension with respect to customer focus and continual improvement, and has kept evolving to address the increasing demands of customers regarding the delivery of products and services. Quality evaluation in the software industry has been a debatable issue, with various models emerging to measure product or service quality. A quality-empowered business requires an in-depth understanding of business goals and objectives. For any industry, whether manufacturing or software development, the measurements of quality are driven by business needs, which in turn are driven by market or customer needs. Organizational objectives are a key factor in measuring quality. The integrated business system has shareholders, employees, customers and suppliers in any value chain. Value chain fundamentals are the primary focus, and value chain components primarily decide the organizational business goals and objectives. A good measurement system for performance evaluation cannot be derived empirically from a mathematical model alone, but from a solid understanding of the value chain, an assessment of the needs of customers/end-users, shareholders and suppliers, and identification of the bottom line.
years. Several research efforts are underway on peer-to-peer communication technologies, but no definitive conclusion is currently available. Compared to traditional client-server technology on the Internet, peer-to-peer technology has the capability to realize highly scalable, extensible and efficient distributed applications. Our work presents an anonymous peer-to-peer (P2P) messaging system. A P2P network consists of a large number of peers interconnected to share all kinds of digital content. A key weakness of most existing P2P systems is the lack of anonymity: without it, third parties can identify the participants involved. First, an anonymous P2P system should make it impossible for third parties to identify the participants involved. Second, it should guarantee that only the content receiver knows the content. Third, it should allow the content publisher to plausibly deny that the content originated from him or her.
Keywords – peer-to-peer service, P2P system, client-server model.
machining operations, achieving the desired surface quality of the machined product is a challenging job on a CNC machine, because these quality features are highly correlated and are influenced, directly or indirectly, by the process parameters. A number of parameters such as cutting speed, feed and depth of cut must be considered during the machining of SS 316L. Predicting the optimal machining conditions for good surface roughness and material removal rate therefore plays a very important role in process planning.
Diabetic Retinopathy. The damage to the retina of the human eye caused by the complication of increased blood glucose levels, which can eventually lead to blindness, is termed diabetic retinopathy. The longer a patient has diabetes, the higher the chance of developing diabetic retinopathy [1]. No specific symptoms are seen in DR patients until the illness is at its final stage, so early detection and timely treatment have to be ensured. Dark lesions such as microaneurysms and hemorrhages, or bright lesions such as exudates, are the visible symptoms of diabetic retinopathy [3]. Microaneurysms are reddish in color with a diameter of less than 125 µm and turn into hemorrhages at a later stage [6]. Conventionally, an ophthalmologist visualizes the blood vessels of the patient's retina using an ophthalmoscope. This method is often time consuming and requires fluorescein angiograms for precise diagnosis. Moreover, it also requires highly trained and skilled clinicians to perform the DR severity grading. This paper presents a low-cost retinal algorithm for detecting microaneurysms and hemorrhages which will assist ophthalmologists across the globe in the timely detection of diabetic retinopathy.
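A generic sketch of dark-lesion candidate detection is given below; it is not the paper's exact algorithm, and the threshold and size limits are illustrative (the 125 µm bound maps to different pixel counts depending on image resolution):

import numpy as np
from scipy import ndimage

def dark_lesion_candidates(rgb, dark_thresh=0.35, min_px=5, max_px=400):
    green = rgb[..., 1].astype(float) / 255.0
    dark = green < dark_thresh                    # red lesions appear darkest in the green channel
    labels, n = ndimage.label(dark)
    sizes = ndimage.sum(dark, labels, range(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if min_px <= s <= max_px]
    return np.isin(labels, keep)                  # boolean mask of candidate lesions

# Example with a synthetic fundus-like image containing one small dark blob.
img = np.full((64, 64, 3), 180, dtype=np.uint8)
img[30:34, 30:34, 1] = 40                         # small dark spot in the green channel
print(dark_lesion_candidates(img).sum())          # -> 16 candidate pixels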
building automation technology has contributed significantly to these advancements. The concept of automation binds surrounding objects to the Ethernet to enable remote access and control. This paper presents the control of electrical appliances over the Ethernet, enabling interaction with and control of the devices from a computer. A web server is developed on an LM3S8962 Stellaris-family microcontroller with a built-in Ethernet controller, and the lightweight IP (LwIP) stack is ported to it to support appliance control. A Graphical User Interface (GUI) developed using web technologies provides an effective user interface. The functionality of the proposed system is checked and verified.
beats too slowly, beats irregularly, or when there is a blockage. An artificial cardiac pacemaker is a medical device that uses electrical impulses, delivered by electrodes, to contract the heart muscles and regulate the beating of the heart. This paper presents an integrated fail-safe pacemaker that produces an artificial pulse whenever a beat is missed by the heart. The unit functions as an advanced dual-chamber pacemaker that can pace both the atrium and the ventricle and hence behaves like a normal heart. The device monitors the electrical activity of the heart through the patient's ECG signal. Device status and vital parameters can be monitored over Bluetooth wireless communication and displayed on an Android platform.
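A highly simplified sketch of the demand-pacing idea, written as plain Python rather than device firmware, is shown below; the escape interval and the beat stream are made-up example values:

ESCAPE_INTERVAL_MS = 1000        # pace if the heart stays silent for 1 s (60 bpm backup rate)

def demand_pacer(sensed_beat_times_ms, end_time_ms):
    """Return the times at which the pacemaker would emit a pacing pulse."""
    paces, last_event = [], 0
    beats = iter(sorted(sensed_beat_times_ms))
    next_beat = next(beats, None)
    t = 0
    while t <= end_time_ms:
        if next_beat is not None and next_beat <= t:
            last_event, next_beat = next_beat, next(beats, None)   # intrinsic beat sensed
        elif t - last_event >= ESCAPE_INTERVAL_MS:
            paces.append(t)                                        # missed beat -> pace
            last_event = t
        t += 1                                                     # 1 ms tick
    return paces

# Intrinsic beats stop after 2.4 s; the pacer takes over at the backup rate.
print(demand_pacer([800, 1600, 2400], end_time_ms=6000))           # -> [3400, 4400, 5400]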
penetration of renewable energy and distributed generation sources. The increased complexity requires new methods to manage the changing sources and loads quickly. This research focuses on one such technology, the solid-state transformer (SST). An SST uses power electronic devices to achieve voltage conversion from one level to another. Several SST topologies have been proposed by different research groups, without a clear consensus on which is most suited for grid applications. To ensure a proper choice of topology, a separate literature review is presented in this paper. The final choice of topology is highly modular: the conventional dc-dc converter of the solid-state transformer is replaced by a SEPIC converter, and the analysis is carried out using a PMSG.
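For reference, the ideal continuous-conduction-mode voltage gain of a SEPIC converter is the textbook relation (a standard result, not one derived in this paper):

\[
  \frac{V_{out}}{V_{in}} = \frac{D}{1 - D},
\]

where D is the switch duty cycle, so the stage steps the voltage down for D < 0.5 and up for D > 0.5, which is what makes the SEPIC attractive as a drop-in replacement for the conventional dc-dc block.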
Computing and Software Engineering. With the advance of cloud technology and services, more research work must be done to address the open issues and challenges in cloud security testing and to develop more innovative testing techniques and solutions. Although there are many published papers discussing cloud security testing, there is a lack of research addressing new issues, challenges and needs in software security testing, and there is no clear methodology to follow in order to complete cloud security testing. Since software usage is increasing, there is a growing demand for software security testing. This paper presents an overview of cloud computing and cloud security testing, together with a comprehensive survey of security testing techniques and methods; from this, problems in current security testing techniques are identified. The work also presents a roadmap that gives new testers on the cloud the necessary information to start their tests.
industries. Power dissipation has become as important a consideration as performance and area in VLSI chip design. As chip sizes shrink, reduced power consumption and on-chip power management are key challenges due to increased complexity. The requirement for low-power chips is a major concern in the VLSI industry because of the steady reduction of chip dimensions and environmental factors. For many designs, power optimization is as important as timing, owing to the need to reduce package cost and extend battery life. This paper presents various techniques to reduce power requirements at various stages of CMOS design, i.e. dynamic power suppression, adiabatic circuits, logic design for low power, reducing glitches, logic-level power optimization, standby-mode leakage suppression, variable body biasing, sleep transistors, dynamic threshold MOS and short-circuit power suppression.
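As background for the dynamic-power techniques listed above, the switching power of a CMOS gate follows the textbook relation (standard material, not a contribution of this paper):

\[
  P_{dyn} = \alpha \, C_L \, V_{DD}^{2} \, f,
\]

where \(\alpha\) is the switching activity factor, \(C_L\) the switched load capacitance, \(V_{DD}\) the supply voltage and \(f\) the clock frequency; most of the listed techniques attack one of these four factors.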
way that oscillation around the maximum power point (MPP) will boost the PV system efficiency. An alternative ESC approach is to introduce a small amount of perturbation into the control system. This work concerns a Maximum Power Point Tracker (MPPT) using an extremum seeking control (ESC) algorithm, with emphasis on solar photovoltaic (PV) systems. ESC is attractive because of its low cost, high efficiency and good power factor. Ripple Correlation Control (RCC) is used in the first stage, together with a high-pass filter and a relational operator, to develop the system; RCC is chosen for its high voltage gain and low input current ripple, which minimizes oscillation at the module operating point. A major advantage of ESC is its ability to improve system performance. The MPPT power-ripple feedback signal is developed by sensing the PV array quantities and multiplying their ripple components together. The entire system is simulated in the MATLAB/Simulink environment and is expected to operate with high efficiency, low cost and a long lifetime.
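The following is a simplified discrete-time extremum-seeking sketch on a toy PV power curve (not the paper's exact ESC/RCC implementation); the dither amplitude, gains and the parabolic P-V model are illustrative assumptions:

import math

def pv_power(v, v_mpp=17.0, p_max=60.0, k=0.4):
    """Toy unimodal P-V curve with its maximum at v_mpp (all values illustrative)."""
    return max(0.0, p_max - k * (v - v_mpp) ** 2)

def esc_mppt(v0=10.0, steps=5000, dt=0.001, a=0.2, f_dither=50.0, gain=0.05, alpha=0.02):
    v_hat, p_avg = v0, pv_power(v0)
    for n in range(steps):
        dither = a * math.sin(2 * math.pi * f_dither * n * dt)
        p = pv_power(v_hat + dither)
        p_hp = p - p_avg                  # crude high-pass: remove the slowly varying average
        p_avg += alpha * (p - p_avg)
        v_hat += gain * p_hp * dither     # correlation of ripple and dither drives v toward the MPP
    return v_hat

print(round(esc_mppt(), 2))               # approaches the assumed v_mpp of 17 V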
performance of these systems depends on the throughput of the multiplier. Meanwhile, the negative bias temperature instability effect occurs when a pMOS transistor is under negative bias, increasing the threshold voltage of the pMOS transistor and reducing multiplier speed. A similar phenomenon, positive bias temperature instability, occurs when an nMOS transistor is under positive bias. Both effects degrade transistor speed, and in the long term the system may fail due to timing violations. Therefore, it is important to design reliable high-performance multipliers. In this paper, we propose an aging-aware multiplier design with a razor-based multiplier circuit. The multiplier is able to provide higher throughput through variable latency and can adjust the razor flip-flop circuit to mitigate performance degradation due to the aging effect. Moreover, the proposed architecture can be applied to a column- or row-bypassing multiplier. The experimental results show that our proposed architecture with 16 × 16 and 32 ×
32 column-bypassing multipliers can attain up to 62.88% and 76.28% performance improvement,
respectively, compared with 16×16 and 32×32 fixed-latency column-bypassing multipliers. Furthermore,
our proposed architecture with 16 × 16 and 32 × 32 row-bypassing multipliers can achieve up to 80.17%
and 69.40% performance improvement as compared with 16× 16 and 32 × 32 fixed-latency row-bypassing
multipliers.
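To make the bypassing idea concrete, the following is a purely functional Python sketch of row bypassing, with no hardware, timing or razor logic: partial-product rows whose multiplier bit is 0 are skipped, which is what reduces switching activity in the real circuit. The operand values are illustrative:

def row_bypassing_multiply(a, b, width=16):
    """Shift-and-add multiplication that skips rows for zero multiplier bits."""
    product, skipped = 0, 0
    for i in range(width):
        if (b >> i) & 1:
            product += a << i          # only non-zero rows contribute work
        else:
            skipped += 1               # this row would be bypassed in hardware
    return product, skipped

p, skipped_rows = row_bypassing_multiply(0x00FF, 0x0101)
print(hex(p), skipped_rows)            # -> 0xffff with 14 of the 16 rows bypassed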
hastened technological progress towards the integration of a variety of wireless access technologies. Therefore, one of the chief concerns of Next Generation Wireless Networks (NGWNs) is the capability of supporting wireless access equipment so as to guarantee a high quality of service across dissimilar wireless networks. To address this problem, it is essential to have decision algorithms that determine, for every mobile-terminal user, which network is the best at a given moment for the service or the specific application the user needs. Many such algorithms rely on the vertical handoff technique. This paper discusses a series of vertical-handoff-based algorithms, with a categorization of the existing vertical handoff decision strategies that try to resolve the issue of wireless network selection at a given time for a specific application of a user. A few parameters to be considered during vertical handover are also discussed briefly.
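As one concrete example of such a decision strategy, the sketch below implements simple additive weighting (SAW), a common scoring approach in the vertical handoff literature; the attribute names, weights and values are illustrative assumptions:

def saw_select(networks, weights):
    def score(attrs):
        # Higher is better for bandwidth and RSSI; lower is better for cost and delay.
        return (weights["bandwidth"] * attrs["bandwidth"]
                + weights["rssi"] * attrs["rssi"]
                - weights["cost"] * attrs["cost"]
                - weights["delay"] * attrs["delay"])
    return max(networks, key=lambda name: score(networks[name]))

candidates = {                       # attributes pre-normalized to [0, 1]
    "WLAN":  {"bandwidth": 0.9, "rssi": 0.6, "cost": 0.1, "delay": 0.3},
    "LTE":   {"bandwidth": 0.7, "rssi": 0.8, "cost": 0.5, "delay": 0.2},
    "WiMAX": {"bandwidth": 0.6, "rssi": 0.7, "cost": 0.4, "delay": 0.4},
}
weights = {"bandwidth": 0.4, "rssi": 0.3, "cost": 0.2, "delay": 0.1}
print(saw_select(candidates, weights))     # this bandwidth-heavy profile favors WLAN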
developed from pongamia oil using methyl alcohol, as an alternative diesel fuel. The major problem in using neat pongamia oil as a fuel in a compression ignition engine arises from its very high viscosity. Transesterification with alcohols reduces the viscosity of the oil, and the other properties
have been evaluated to be comparable with those of diesel. In the present project work, an
experimental investigation is carried out on performance and emission characteristics of preheated
higher blends of pongamia biodiesel with diesel. The higher blends of fuel are preheated to 60, 75, 90 and 110 ˚C using waste exhaust-gas heat in a shell-and-tube heat exchanger.
The transesterification process is used to produce the biodiesel required for the project from raw pongamia oil. Experiments were carried out using B20 and B40 biodiesel blends at different preheating temperatures and loads. A significant improvement in the performance and emission characteristics of the preheated B40 blend was obtained: the B40 blend preheated to 110 ˚C showed a maximum increase in brake thermal efficiency of 8.72% over diesel and 8.97% over the B20 blend at 75% load. The highest reductions in UBHC emission and smoke opacity were 79.41% and 80.6% respectively over diesel, and 78.12% and 73.54% respectively over the B20 blend, for the B40 blend preheated to 110 ˚C at 75% load. Thus, preheating higher blends of diesel and biodiesel at higher temperatures markedly improves the viscosity and other properties, and improves performance and emissions.
any major raw minerals from other states. Workable economic deposits of almost all minor and major minerals are located here, and the state is also rich in power, water and human resources. Adequate quantities of different kinds of raw minerals are available to sustain conventional industries such as thermal power generation, extraction, cutting and polishing units for gem and dimension stones, and ancillary units derived from the cement and iron industries.
depending upon its behavioural conditions is a complex biological process studied through the use of
both in vivo and in vitro experimentation. Mathematical models provide the approach by using a
controlled environment in which a system can be described quantitatively. This can also yield data
which predicts the behaviour of cells and likely medical conditions after thorough analysis by the
modeller. In an effort to study the characteristics that increase cell fitness, the paper presents a 2D
Cellular Automaton model that uses computer simulation to describe the invasion of healthy tissue
by cancer cells. The growth process is simulated and it was found that movement of cells affects
tumour growth rate. It was also found that the relative distance of the tumour initiation area from
neighbouring vessels influences the growth of tumour. The model and the simulation software
developed thus can be used to understand the dynamics of early tumour growth and to explore
various hypotheses of tumour growth relevant to drug delivery in chemotherapy. Importantly, this
approach highlights that vessel displacement should not be neglected in tumour growth models. The
paper thus presents two models, i.e. a cancer growth model and a drug transport model, for tumour growth and treatment that will help to diagnose early tumour growth. Though cancer is difficult to cure, early and quick detection will help doctors act in a better way by suggesting prompt remedial action against it.
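A minimal 2D cellular-automaton sketch of this kind of growth simulation is given below; the grid size, division probability and number of steps are illustrative, not the paper's parameters:

import random

EMPTY, TUMOUR = 0, 1

def step(grid, p_divide=0.3):
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j] == TUMOUR and random.random() < p_divide:
                nbrs = [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                        if 0 <= i + di < n and 0 <= j + dj < n and grid[i + di][j + dj] == EMPTY]
                if nbrs:
                    x, y = random.choice(nbrs)
                    new[x][y] = TUMOUR        # daughter cell occupies an empty neighbouring site
    return new

grid = [[EMPTY] * 51 for _ in range(51)]
grid[25][25] = TUMOUR                          # single initiating cell at the centre
for _ in range(40):
    grid = step(grid)
print(sum(map(sum, grid)), "tumour cells after 40 steps")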