Automated methodology to reduce the redundancy in relational databases
Normalization is a correct process for good relational database design of an application, but not a complete one. To illustrate, in a Boyce-Codd normal form design, three relations may be created from two functional dependencies. Clearly, this is a setback to the basic aim of normalization, i.e., the reduction of database redundancy. The open question here is whether database redundancy can be reduced further. Secondly, database designers create relations from a set of attributes and the functional dependencies existing among them. This manual design work may lead to ambiguity and incorrect relations when the set of attributes and functional dependencies is large. Automating this work overcomes the lacuna. Researchers have shown that there is a natural correspondence between hypergraphs and relational database schemas. Beeri et al. introduced a special class of hypergraphs known as ‘acyclic’, where the desirable properties of a relational database are equivalent to α-acyclicity. Another researcher, Fagin, introduced γ-acyclicity, which establishes the condition of a unique relationship among the attributes. Hence, transforming the hypergraph from cyclic to acyclic satisfies the properties of a relational database. Our paper proposes a methodology that takes the functional dependencies and attribute set as input and identifies the candidate key attributes. From the candidate key attributes, the hypergraph is redefined for acyclicity by isolating the functional dependency (or dependencies) framed by the candidate key attributes.
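The methodology starts from candidate key identification. As a rough illustration only (not the authors' algorithm), the standard attribute-closure approach can find candidate keys from a set of attributes and functional dependencies; the Python sketch below assumes each FD is given as a (left-hand side, right-hand side) pair of attribute sets.

```python
from itertools import combinations

def closure(attrs, fds):
    """Attribute closure of `attrs` under FDs given as (lhs, rhs) frozenset pairs."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def candidate_keys(attributes, fds):
    """Return all minimal attribute combinations whose closure covers every attribute."""
    attributes = set(attributes)
    keys = []
    for size in range(1, len(attributes) + 1):
        for combo in combinations(sorted(attributes), size):
            if any(set(k) <= set(combo) for k in keys):
                continue  # skip supersets of an already-found key (not minimal)
            if closure(combo, fds) == attributes:
                keys.append(combo)
    return keys

# Toy schema R(A, B, C, D) with A -> B and C -> D; the only candidate key is {A, C}.
fds = [(frozenset("A"), frozenset("B")), (frozenset("C"), frozenset("D"))]
print(candidate_keys("ABCD", fds))  # [('A', 'C')]
```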
Natural Language Interface to Databases: Development Techniques
In the world of computing, information plays an important role in our lives. One of the major sources of information is the database. Databases and database technology have a major impact on the growing use of computers. Almost all IT applications store and retrieve information or data from a database, and Database Management Systems (DBMS) have been widely used for this purpose. However, databases are often hard to use since their interfaces are quite rigid in cooperating with users. Storing and retrieving information from a database requires knowledge of a database language such as SQL. Structured Query Language (SQL) is an ANSI standard for accessing and manipulating the information stored in a database. However, not everyone may be able to write an SQL query, as they may not be aware of the syntax of SQL or the structure of the database. The purpose of a Natural Language Interface is to allow users to compose questions in natural language and receive the response in natural language as well. The idea of using natural language instead of SQL has promoted the development of a new type of processing called Natural Language Interface to Database (NLIDB). This paper gives an introduction to Intelligent Database Systems, Natural Language Processing, and Natural Language Interfaces to Databases. It also gives a brief overview of the subcomponents of an NLIDB and the techniques used in the development of an NLIDB, along with its architecture.
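As a toy illustration of the NLIDB idea (not a technique taken from the paper), the Python sketch below maps one fixed question pattern onto a SQL SELECT statement; the `employees` table and its columns are hypothetical.

```python
import re

# Hypothetical one-table schema used only for illustration.
SCHEMA = {"employees": ["name", "salary", "department"]}

def question_to_sql(question: str) -> str:
    """Tiny pattern-based translator: one question shape -> one SELECT statement."""
    q = question.lower().strip("? ")
    m = re.match(r"list the (\w+) of employees in (\w+)", q)
    if m:
        column, department = m.groups()
        if column in SCHEMA["employees"]:
            return (f"SELECT {column} FROM employees "
                    f"WHERE department = '{department}'")
    raise ValueError("question not understood")

print(question_to_sql("List the salary of employees in marketing?"))
# SELECT salary FROM employees WHERE department = 'marketing'
```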
Review of security issues in cloud computing related to Single and Multi-clouds
The use of cloud computing has grown quickly in many organizations. Cloud computing provides several benefits in terms of low cost and accessibility of data. Ensuring the security of cloud computing is a major concern in the cloud computing environment, as users often store sensitive data with cloud storage providers, but these providers may be untrusted. Dealing with “single cloud” providers is predicted to become less popular with customers owing to the risk of service availability failure and the risk of malicious insiders in the single cloud. A movement towards “multi-clouds”, or in other words “interclouds” or “cloud-of-clouds”, has emerged recently. This paper surveys recent research related to single-cloud and multi-cloud security and addresses possible solutions. It is found that research into the use of multi-cloud providers to maintain security has received less attention from the research community than has the use of single clouds. This work aims to promote the use of multi-clouds owing to their ability to reduce the security risks that affect the cloud computing user.
A 3D stacked mesh NoC for reliable inter-layer communication and congestion reduction
The increasing viability of 3D silicon integration technology has opened new opportunities for chip architecture innovations. One direction is the extension of 2D mesh-based chip multiprocessor architectures into three dimensions. We present an efficient architecture to optimize the system performance, power consumption, and reliability of a stacked mesh 3D NoC. Stacked mesh is a feasible architecture that takes advantage of short inter-layer wiring delays, while suffering from inefficient intermediate buffers. To cope with this, an inter-layer communication mechanism is developed to enhance buffer utilization, load balancing, and system fault tolerance. The mechanism benefits from a congestion-aware and bus-failure-tolerant routing algorithm for vertical communication.
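The abstract does not spell out the routing algorithm, but a hypothetical Python sketch of the general idea follows: for inter-layer transfer, pick the non-faulty vertical bus with the lowest buffer occupancy, and fall back to in-layer routing when no bus is operational. The data structures and metric here are assumptions for illustration, not the paper's design.

```python
from dataclasses import dataclass

@dataclass
class VerticalBus:
    bus_id: int
    buffer_occupancy: int   # flits currently queued (simple congestion metric)
    faulty: bool = False

def select_vertical_bus(buses):
    """Pick the least-congested, non-faulty bus for inter-layer transfer."""
    healthy = [b for b in buses if not b.faulty]
    if not healthy:
        raise RuntimeError("no operational vertical bus: fall back to 2D detour routing")
    return min(healthy, key=lambda b: b.buffer_occupancy)

buses = [VerticalBus(0, 7), VerticalBus(1, 2, faulty=True), VerticalBus(2, 3)]
print(select_vertical_bus(buses).bus_id)  # 2
```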
Designing Association Models for Disease Prediction using Apriori
Data mining has three major components: clustering or classification, association rules, and sequence analysis. Association rules are used to find interesting relationships between attribute values. Sequence analysis is used to find statistically relevant patterns in data. Data mining techniques have provided various methods to gain knowledge from vast amounts of data. Association rules are mainly used in mining transaction data to find interesting relationships between attribute values, and they are a main topic of data mining. Candidate generation is a great challenge for large data with a low support threshold; how association rules can work effectively with solid data and a low support threshold is discussed. Using the Apriori algorithm, we applied association rules to a dataset from certain areas to predict the chance of contracting dengue; this dataset was collected from selected areas, so it is real data. Three different sets of rules are generated by applying the Apriori algorithm to this dataset to find the relations between the parameters in the database.
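For reference, a minimal Apriori pass over toy transactions is sketched below in Python; the items and support threshold are illustrative and are not taken from the paper's dengue dataset.

```python
def apriori(transactions, min_support):
    """Minimal Apriori: return frequent itemsets (frozensets) with their support counts."""
    n = len(transactions)
    transactions = [frozenset(t) for t in transactions]
    candidates = {frozenset([item]) for t in transactions for item in t}  # 1-itemsets
    frequent = {}
    while candidates:
        # count support of the current candidates and keep those above the threshold
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        survivors = {c: s for c, s in counts.items() if s / n >= min_support}
        frequent.update(survivors)
        # join step: combine surviving k-itemsets into (k+1)-itemset candidates
        prev = list(survivors)
        candidates = {a | b for a in prev for b in prev if len(a | b) == len(a) + 1}
    return frequent

# Toy symptom/environment transactions (illustrative only).
data = [{"fever", "stagnant_water"}, {"fever", "rash"},
        {"fever", "stagnant_water", "rash"}, {"stagnant_water"}]
for itemset, count in apriori(data, min_support=0.5).items():
    print(set(itemset), count)
```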
Descriptive approach to software development life cycle models
The concept of system life cycle models came into existence to emphasize the need to follow a structured approach towards building a new or improved system. Many models have been suggested, such as the waterfall, prototype, rapid application development, V-shaped, and top & bottom models. In this paper, we review the traditional as well as recent software development life cycle models.
Analyzing blood cell images to differentiate WBC and counting of linear & non-linear overlapping RBC based on morphological features
In this paper we propose a new set of morphology-based features for the total RBC count in a given blood sample, along with the classification of WBCs. For the diagnosis of any disease, the first requirement is a complete blood count, i.e., the total number of RBCs, WBCs, and platelets in a given blood sample. If any of these types is present in excess, or is too few in number, this gives the doctor a basic indication that the person is not healthy. Doing this manually is very tedious. Here, RBCs and WBCs are first differentiated and then counted by an algorithm. The implementation is done in MATLAB.
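The paper's implementation is in MATLAB; the following Python/scikit-image sketch is only an analogous outline of morphology-based counting (threshold, clean up the mask, label regions, and estimate overlapping clumps by area). The input file name and the size parameters are assumptions, not values from the paper.

```python
import numpy as np
from skimage import io, filters, morphology, measure

image = io.imread("blood_smear.png", as_gray=True)  # hypothetical input file

# Threshold and clean up the binary mask of cells.
mask = image < filters.threshold_otsu(image)
mask = morphology.remove_small_objects(mask, min_size=50)
mask = morphology.binary_closing(mask, morphology.disk(2))

# Label connected components; a clump of overlapping cells is roughly counted
# as its area divided by the median area of a single cell.
labels = measure.label(mask)
areas = np.array([r.area for r in measure.regionprops(labels)])
single_cell_area = np.median(areas)
estimated_count = int(np.round(np.sum(areas / single_cell_area)))
print("estimated RBC count:", estimated_count)
```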
Simulation of Seasonal precipitation using ANN and ARIMA Models: A case study of (Iran) Khozestan
Accurate rainfall prediction is of great interest for water management and flood control. In reality, the physical processes influencing the occurrence of rainfall are highly complex, uncertain, and nonlinear. In this paper, we present tools for modeling and predicting the behavioral pattern of rainfall phenomena based on past observations. The aim of this paper is to predict the seasonal rainfall of Khozestan (Iran) using artificial neural network (ANN) and autoregressive integrated moving average (ARIMA) models. In order to evaluate the prediction efficiency, we made use of 33 years of seasonal rainfall data, from 1976 to 2008, for Khozestan Province (Iran). The models were trained with 28 years of seasonal rainfall data. The ANN and ARIMA approaches are applied to the data to derive the network weights and the regression coefficients, respectively. The performance of the models was evaluated using the remaining 5 years of data. The study reveals that the ANN model can be used as an appropriate forecasting tool to predict the rainfall, and that it outperforms the ARIMA model.
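A minimal sketch of the 28-year/5-year train-test split with an ARIMA model is shown below using statsmodels; the synthetic data and the order (p, d, q) = (2, 0, 1) are placeholders, not the paper's data or settings.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
seasonal_rain = rng.gamma(shape=2.0, scale=30.0, size=33 * 4)  # synthetic stand-in series

# 28 years (112 seasons) for training, the remaining 5 years (20 seasons) for testing.
train, test = seasonal_rain[: 28 * 4], seasonal_rain[28 * 4 :]
model = ARIMA(train, order=(2, 0, 1)).fit()
forecast = model.forecast(steps=len(test))

rmse = np.sqrt(np.mean((forecast - test) ** 2))
print(f"ARIMA RMSE on the 5 held-out years: {rmse:.2f}")
```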
Ontological Reliability Quantification Method
Software reliability quantification plays a very significant role in software consistency and excellence. However, conventional software reliability quantification methods mostly focus on evaluation using failure data, which is obtained only after testing or usage in the late phases of the software life cycle. Therefore, a method to obtain and quantify software reliability with the help of architectural style may be introduced. Ontology allows developers and users to better understand software architecture and reliability terminology, assess software reliability, and communicate effectively with software reliability engineers. Therefore, an Ontological Reliability Quantification Method (ORQM) is introduced in this paper, which focuses on various project categories correlated with architectural style and the relevant project parameters. Finally, some case studies are presented to demonstrate the viability of this method.
Secured medical image transmission using chaotic map
Image cryptography and steganography have attracted extensive research on the security of messages transmitted over an open, insecure medium. This is due to the fact that huge amounts of data can be hidden without perceptible impact on the carriers, and possibly because of the popularity of electronic images and medical images, which have become widely available. Chaos-based encryption has its own advantages; it is mainly based on the initial condition, which is the secret key for the encryption. Chaos-based encryption serves as a robust mechanism against many kinds of attacks. In this paper, a novel image encryption and decryption scheme is proposed. Due to their sensitivity to initial conditions, chaotic maps have good potential for designing dynamic permutation maps. Here, a chaotic Hénon map is used to generate the permutation signal. Simulation results illustrate that the scheme is highly key-sensitive and shows good resistance against brute-force and statistical attacks.
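As a rough illustration of the idea (not the paper's exact scheme), the Python sketch below iterates the Hénon map from a secret initial condition and ranks the resulting sequence to obtain a pixel permutation; the classic parameters a = 1.4, b = 0.3 are assumed.

```python
import numpy as np

def henon_permutation(n, x0, y0, a=1.4, b=0.3, burn_in=1000):
    """Iterate the Henon map, then rank the x-sequence to get a permutation of 0..n-1."""
    x, y = x0, y0
    for _ in range(burn_in):                  # discard transient iterations
        x, y = 1 - a * x * x + y, b * x
    xs = np.empty(n)
    for i in range(n):
        x, y = 1 - a * x * x + y, b * x
        xs[i] = x
    return np.argsort(xs)                     # chaotic sequence -> permutation

key = (0.1, 0.3)                              # secret initial condition (the key)
perm = henon_permutation(16, *key)
image_block = np.arange(16)                   # stand-in for a flattened pixel block
scrambled = image_block[perm]                 # encryption: permute pixels
recovered = np.empty_like(scrambled)
recovered[perm] = scrambled                   # decryption: apply the inverse permutation
print(np.array_equal(recovered, image_block)) # True
```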