Concurrency control technique for enhancing RDBMS efficiency, performance and activity
The efficiency, performance, and activity of an RDBMS must remain consistent, and this consistency is achieved through concurrency control. Concurrency control deals with the issues involved in allowing multiple people simultaneous access to shared entities, be they objects, data records, or some other representation, and the concurrency control mechanism ensures that the database maintains its consistency. Concurrency control of distributed transactions requires a distributed synchronization algorithm, which must ensure that concurrent transactions are not only serializable at each site where they execute but also globally serializable. The advanced protocols put forward focus on increasing the level of concurrency while simultaneously decreasing the rate of transaction restarts; some protocols also compromise consistency in order to achieve concurrency. Several concurrency control algorithms have been proposed, implemented, and used in a variety of real-world applications. This article presents an analysis of various concurrency control techniques for database systems and further discusses some hybrid techniques that provide efficient concurrency control.
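To make the locking side of concurrency control concrete, here is a minimal Python sketch of strict two-phase locking, one of the classical techniques surveyed; the `LockManager` and `Transaction` classes and the transfer scenario are illustrative inventions, not the interface of any particular DBMS.

```python
import threading

class LockManager:
    """Hands out one exclusive lock per data item (illustrative)."""
    def __init__(self):
        self._locks = {}
        self._guard = threading.Lock()

    def lock(self, item):
        with self._guard:
            lk = self._locks.setdefault(item, threading.Lock())
        lk.acquire()
        return lk

class Transaction:
    """Strict two-phase locking: locks are only acquired (growing phase)
    until commit, which releases them all at once (shrinking phase)."""
    def __init__(self, mgr, store):
        self.mgr, self.store, self.held = mgr, store, {}

    def _acquire(self, item):
        if item not in self.held:
            self.held[item] = self.mgr.lock(item)

    def read(self, item):
        self._acquire(item)
        return self.store[item]

    def write(self, item, value):
        self._acquire(item)
        self.store[item] = value

    def commit(self):
        for lk in self.held.values():
            lk.release()
        self.held.clear()

store = {"x": 0, "y": 0}
mgr = LockManager()

def transfer(amount):
    # Items are always locked in the same order (x before y),
    # which avoids deadlock between concurrent transfers.
    t = Transaction(mgr, store)
    t.write("x", t.read("x") - amount)
    t.write("y", t.read("y") + amount)
    t.commit()

threads = [threading.Thread(target=transfer, args=(1,)) for _ in range(10)]
for th in threads:
    th.start()
for th in threads:
    th.join()
print(store)   # {'x': -10, 'y': 10}: the invariant x + y == 0 is preserved
```

Because every transaction holds its locks until commit, interleaved transfers cannot observe a half-updated pair of items; a fixed global lock order stands in for the deadlock handling a real system would need.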
Enhancing automatic face detection from input videos using multiple algorithms
This research work proposes a novel technique for an automatic face recognition (AFR) system using cascaded structures and a clustering network. Human beings are capable of recognizing varying facial features from a very early age, thanks to the human visual system (HVS), but it is difficult to replicate the human visual system with a computer vision system. The basic idea of the proposed work is a divide-and-conquer approach: a dedicated method is designed for each processing stage, and the entire strategy is then embedded into the AFR system. Two important factors, cost efficiency and the technology applied to the varying characteristics of the input image, are considered rather than only the traditional factors such as accuracy and retrieval rate. For face detection, a heterogeneous cascaded detector based on various features is designed to increase processing capability and detection efficiency. For facial feature extraction, sparse-graph, component-based direct fitting and component-based texture fitting methods are used to extract the various features at different orientations.
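The early-rejection idea behind a cascaded detector can be sketched in a few lines of Python; the stage features (`edge_density`, `symmetry`, `skin_ratio`) and thresholds below are hypothetical stand-ins for the real image features such a detector would compute.

```python
def make_stage(feature, threshold):
    """One cascade stage: a cheap test that passes or rejects a window."""
    return lambda window: feature(window) >= threshold

def cascade_detect(window, stages):
    """Run increasingly expensive stages in order; reject at the first
    failure, so most non-face windows exit after the cheap early stages."""
    for stage in stages:
        if not stage(window):
            return False
    return True

# Toy 'windows' as dicts of precomputed feature scores (hypothetical).
stages = [
    make_stage(lambda w: w["edge_density"], 0.3),   # cheap first stage
    make_stage(lambda w: w["symmetry"], 0.5),       # intermediate stage
    make_stage(lambda w: w["skin_ratio"], 0.6),     # expensive final stage
]

face_like  = {"edge_density": 0.8, "symmetry": 0.7, "skin_ratio": 0.9}
background = {"edge_density": 0.1, "symmetry": 0.2, "skin_ratio": 0.0}

print(cascade_detect(face_like, stages))    # True
print(cascade_detect(background, stages))   # False, rejected at stage 1
```

The design point is that the cheap early stages do most of the rejection work, so a heterogeneous mix of detectors can improve throughput without running every feature on every window.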
Novel Incremental ID3 Algorithm for Classification
Discretization transforms continuous attribute values into a finite number of intervals and associates a numerical, discrete value with each interval. For mixed-mode (continuous and discrete) data, discretization is usually performed prior to the learning process and plays an important role in the KDD process. CAIM is a very efficient supervised discretization algorithm. Recent data mining technology is found to be too slow to handle data of very large scale. In addition, data mining needs to be a continuous, online process rather than an occasional one-shot process, which has created a need for an incremental approach to effective model preparation and updating. Incremental classification, as proposed in the literature, requires online discretization, which in turn has created a need for a fast and efficient discretization algorithm. The Modified CAIM (MoCAIM) algorithm is proposed and used for online discretization in the Novel ID3 (NID3) algorithm, where CAIR is used as the attribute selection criterion. Improved NID3 (INID3) is proposed to improve classification accuracy by considering the unclassified instances of the test and prediction phases of the classification process. The outperforming results of the MoCAIM and INID3 algorithms in terms of classification accuracy and execution time motivate further exploration of the process for streamed-data classification.
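For context on the attribute-selection step, the sketch below shows plain information gain, the criterion classic ID3 uses on already-discretized attributes (the proposed NID3 replaces it with CAIR; the toy `outlook`/`windy` data here is illustrative only).

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction from splitting on a (discretized) attribute --
    the split criterion classic ID3 maximizes at each tree node."""
    base, n, split = entropy(labels), len(labels), {}
    for row, y in zip(rows, labels):
        split.setdefault(row[attr], []).append(y)
    return base - sum(len(ys) / n * entropy(ys) for ys in split.values())

# Toy data: 'outlook' perfectly predicts the label, 'windy' does not.
rows = [{"outlook": "sun",  "windy": 0},
        {"outlook": "sun",  "windy": 1},
        {"outlook": "rain", "windy": 0},
        {"outlook": "rain", "windy": 1}]
labels = ["play", "play", "stay", "stay"]

print(information_gain(rows, labels, "outlook"))  # 1.0 (perfect split)
print(information_gain(rows, labels, "windy"))    # 0.0 (no information)
```

An incremental variant must keep these per-attribute label counts updated as new instances stream in, which is exactly why a fast online discretizer such as MoCAIM matters: the counts are only meaningful once continuous attributes have been binned.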
Together Application-Health Diary app for caregivers of Alzheimer's patients
The number of Alzheimer’s patients is increasing continuously around the world, highlighting the challenge of ensuring consistent quality of care for all patients. This project studies the daily care needs of Alzheimer’s patients and medical care providers. It aims to address this challenge by creating a mobile application through which care providers can follow up on and record the daily health developments of Alzheimer’s patients. The core aim of this project is the development of a mobile app that, based on scientific and medical sources, helps Alzheimer’s disease care providers render their services in a modern, easy, and effective way. The app is also designed to help save time, money, and effort. As a result, this project has several specific objectives it seeks to achieve, including design, ease of use, quality improvement, and support services for care providers and doctors.
DSP algorithm for music-less audio stream generation
In this paper we investigate the problem of separating the human voice from a mixture of voice and different musical instruments. The human voice may be part of the singing in a song, or it may be part of a news broadcast that contains background music. The final outcome of this work is a file containing only vocals. In our approach we consider stereo audio for separation and process the signal in the time-frequency domain. In our method of blind source separation, we process the input stereo audio file as frames, window them, and then apply the discrete Fourier transform (DFT) to the signal. The signal is then masked for de-mixing using time-frequency filters: the non-zero DFT coefficients estimated to be part of the vocals are selected, and the signal is reconstructed by the overlap-add (OLA) method to obtain the final output signal containing only vocals.
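The framing and overlap-add reconstruction step can be sketched in pure Python; this is a minimal illustration assuming a periodic Hann window at 50% overlap (whose overlapping copies sum to one), with the DFT masking stage omitted for brevity.

```python
import math

def hann(n):
    """Periodic Hann window of length n."""
    return [0.5 - 0.5 * math.cos(2 * math.pi * i / n) for i in range(n)]

def overlap_add(frames, hop):
    """Reconstruct a signal by summing frames at successive hop offsets
    (the OLA step that would follow inverse-DFT of the masked spectra)."""
    n = len(frames[0])
    out = [0.0] * (hop * (len(frames) - 1) + n)
    for k, frame in enumerate(frames):
        for i, v in enumerate(frame):
            out[k * hop + i] += v
    return out

N, hop = 8, 4                                  # 50% overlap
w = hann(N)
signal = [math.sin(0.3 * t) for t in range(64)]

# Analysis: slice into windowed frames. In the full pipeline, each frame
# would be DFT'd, masked to keep vocal coefficients, and inverse-DFT'd
# before the overlap-add below.
frames = [[signal[s + i] * w[i] for i in range(N)]
          for s in range(0, len(signal) - N + 1, hop)]

rebuilt = overlap_add(frames, hop)
# A periodic Hann window at 50% overlap satisfies w[i] + w[i + hop] == 1,
# so the interior of the signal is reconstructed exactly.
interior_err = max(abs(rebuilt[t] - signal[t])
                   for t in range(N, len(rebuilt) - N))
print(interior_err < 1e-9)   # True
```

The constant-overlap-add property is what lets per-frame spectral masking be undone into a seamless time-domain vocal track; window and hop choices that violate it introduce audible amplitude modulation.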
Generation of a geophysics-based computer code for estimation of crustal thickness - A case study
In this study an extended geophysical survey using gravity and high-angle refracted seismic methods, together with recorded earthquakes, was carried out in Kerman province in southeast Iran. The purpose of the applied geophysical surveys was to provide information on crustal structure and lithospheric thickness. The research focuses on determining the Moho depth; to this end, several profiles were acquired for both methods, and the crustal and lithospheric thickness of Kerman province was calculated using topography and gravity data in a reasonable combination with high-angle refracted seismic data and earthquake records. The Oldenburg-Parker algorithm formed the basis of the gravity analysis; it was optimized, improved, and then executed using the MATLAB programming environment and generated C# computer code. The key product of this study is the generated computer code, named “MOHO R.A.T 1.01”. For the seismic data, a least-squares analysis of the travel-time data was performed, with the uncertainties taken into account. Finally, depth calculations for the velocity discontinuities and gravity anomaly contour maps were made. A comprehensive comparison between the obtained results and other studies of the selected area showed good agreement, which verifies the capability of the produced code.
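The least-squares travel-time analysis can be illustrated with a small sketch; the offsets, times, and velocity below are synthetic numbers for illustration, not data from the Kerman survey.

```python
def least_squares_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b (closed-form solution)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical refraction data: offsets (km) and first-arrival times (s).
# For a refracted head wave, t = x / v2 + t0, so the fitted slope is the
# reciprocal of the refractor velocity and the intercept time t0
# constrains the depth to the refractor (here, the Moho).
offsets = [60.0, 90.0, 120.0, 150.0, 180.0]
times   = [x / 8.0 + 6.5 for x in offsets]     # synthetic: v2 = 8 km/s

slope, intercept = least_squares_line(offsets, times)
print(round(1.0 / slope, 2))    # apparent refractor velocity: 8.0 km/s
print(round(intercept, 2))      # intercept time: 6.5 s
```

With real picks, each travel time carries an uncertainty, so the fit would be weighted and the slope and intercept reported with error bars before being converted to a depth estimate.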
Performance analysis of TCP and TCP-F in Ad-Hoc network
Many applications use TCP as the transport layer for reliable data transfer over wireless connections to integrate seamlessly into the Internet. Some of the assumptions made during the design of traditional TCP may not be suitable for an infrastructure-less network environment, because TCP invokes its congestion control mechanism even when packet loss is due to link failure. TCP-F, on the other hand, is able to distinguish link failure from congestion through feedback from the intermediate nodes and to take appropriate action. This paper compares the performance of TCP-F with traditional TCP through simulation.
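The key behavioural difference can be sketched with a toy sender model; the window sizes and the `on_loss`/`on_route_restored` hooks are illustrative inventions, not the actual TCP-F messages (which are Route Failure and Route Re-establishment Notifications from intermediate nodes).

```python
class Sender:
    """Toy sender contrasting standard TCP, which treats every loss as
    congestion, with a TCP-F-style sender that freezes on route failure."""
    def __init__(self, feedback=False):
        self.feedback = feedback   # True = TCP-F style feedback available
        self.cwnd = 32             # congestion window, in segments
        self.frozen = None

    def on_loss(self, route_failure=False):
        if self.feedback and route_failure:
            # TCP-F: an intermediate node reported a route failure, so
            # freeze the connection state instead of cutting the window.
            self.frozen = self.cwnd
        else:
            # Standard TCP: assume congestion, multiplicative decrease.
            self.cwnd = max(1, self.cwnd // 2)

    def on_route_restored(self):
        if self.frozen is not None:
            self.cwnd = self.frozen   # resume at the pre-failure rate
            self.frozen = None

tcp, tcp_f = Sender(), Sender(feedback=True)
for s in (tcp, tcp_f):
    s.on_loss(route_failure=True)
    s.on_route_restored()

print(tcp.cwnd, tcp_f.cwnd)   # 16 32: only the TCP-F sender keeps its window
```

After a transient route failure, the standard sender must grow its window back from the halved value, while the feedback-aware sender resumes at full rate immediately, which is the throughput gap the simulations in the paper quantify.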
Sensor data analysis and management in wireless sensor networks
Harvesting the benefits of a sensor-rich world presents many data analysis and management challenges, and recent advances in research and industry aim to address them. Modern sensors and information technologies make it possible to collect sensor data continuously, typically as real-time, real-valued numerical data. Examples include vehicles driving around cities or a power plant generating electricity, which can be equipped with numerous sensors that produce data from moment to moment. Though data gathering systems are becoming relatively mature, much innovative research remains to be done on knowledge discovery from these huge repositories of data. Data management techniques and analysis methods are required to process increasing volumes of historical and live streaming data sources simultaneously. Improved techniques are needed to reduce an analyst’s decision response time and to enable more intelligent and immediate situation awareness. Faster analysis of disparate information sources may be achieved by providing a system that allows analysts to pose integrated queries over diverse data sources without losing data provenance. This paper proposes to develop abstractions that make it easy for users and application developers to continuously apply statistical modeling tools to streaming sensor data. Such statistical models can be used for data cleaning, prediction, interpolation, anomaly detection, and inferring hidden variables from the data, thus addressing many of the challenges in analyzing and managing sensor data. Current archive-data and streaming-data querying techniques are insufficient by themselves to harmonize sensor inputs from large volumes of data; these two distinct architectures (push versus pull) have yet to be combined to meet the demands of a data-centric world, and the input of streaming data from multiple sensor types further complicates the problem.
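As a minimal example of continuously applying a statistical model to a stream, the sketch below maintains a running mean and variance with Welford's online algorithm and flags readings more than three standard deviations from the model; the readings, warm-up length, and threshold are illustrative choices, not values from the paper.

```python
import math

class StreamModel:
    """Incrementally maintained mean/variance (Welford's algorithm) used
    to flag streaming sensor readings that deviate strongly from the
    model -- one simple form of online anomaly detection."""
    def __init__(self, z_threshold=3.0, warmup=10):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.z, self.warmup = z_threshold, warmup

    def update(self, x):
        """Fold one reading into the model; return True if it was
        anomalous relative to the model *before* this update."""
        anomalous = self.n > self.warmup and self.is_anomaly(x)
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)   # running sum of squared deviations
        return anomalous

    def is_anomaly(self, x):
        std = math.sqrt(self.m2 / (self.n - 1))
        return std > 0 and abs(x - self.mean) > self.z * std

model = StreamModel()
readings = [20.0, 20.4, 19.8, 20.1, 20.3, 19.9, 20.2, 20.0,
            20.1, 19.7, 20.2, 95.0]          # last value: a sensor glitch
flags = [model.update(x) for x in readings]
print(flags[-1], sum(flags[:-1]))   # True 0: only the glitch is flagged
```

The same incremental-moments pattern supports the other uses the abstract lists, such as data cleaning (drop or impute flagged readings) and interpolation (fall back to the model mean), without ever rescanning the archive.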
Heterogeneous Framework for Indian Cybercrime Cases
The internet has served as the global language of the virtual world since the beginning of the digital era, and Internet use in India is growing rapidly. The Internet now serves many areas of the current scenario, such as trade, education, sports, and research. The Internet can be treated as a coin with two sides: merits and demerits. A major problem of the internet in the current scenario is cybercrime. Here we study and analyze current cybercrimes in India through a literature survey, government annual reports, verbal communication with ethical hackers, and further techniques such as questionnaires and interviews with the heads of cyber cells in different states of India. We also study and analyze cybercrime case studies that occurred in different states of India, and we compare the Indian cyber-law framework with the frameworks of other countries. We conclude that our cyber-law framework has some vulnerabilities and problems in executing cyber cases compared to other countries. We therefore propose a heterogeneous approach, or model, for executing our legal framework smoothly against cybercrime in the current scenario.
Semantic web modeling of a high school’s information system along with SPARQL queries
In the first part of this work we present the modelling of a high school information system using WebProtege; the system ontologies and class properties are described. In the second part we give an introduction to SPARQL and examples of the queries that were made, along with the results they returned.
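To give a flavour of the kind of query discussed, here is a hypothetical SPARQL example against a school ontology; the prefix, class, and property names (`sch:Teacher`, `sch:teaches`, `sch:name`) are illustrative and not taken from the actual WebProtege model.

```sparql
# List every teacher together with the course they teach
# (class and property names are hypothetical).
PREFIX sch: <http://example.org/school#>

SELECT ?teacherName ?courseName
WHERE {
  ?teacher a           sch:Teacher ;
           sch:name    ?teacherName ;
           sch:teaches ?course .
  ?course  sch:name    ?courseName .
}
ORDER BY ?teacherName
```

Each triple pattern in the WHERE clause matches against the ontology's RDF graph, and the bindings of the two selected variables form the result table returned to the user.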