the characteristics and structure of the B+ tree, and also discusses the one-dimensional data index structure used in the database.
When dealing with large amounts of data, processing efficiency can be compromised by high server load. Therefore, this paper proposes an incremental association rule mining algorithm based on Hadoop load balancing. The algorithm uses the MapReduce programming model to parallelize the mining tasks, and uses load balancing to mine frequent
International Journal of Advanced Network, Monitoring and Controls, ISSUE 4, 7–16
With the fast development and deep application of the Internet, the problem of mass image data storage stands out, exposing the low management efficiency, limited storage capacity, and high cost of traditional storage frameworks. The appearance of Hadoop provides a new approach. However, Hadoop itself is not well suited to handling small files. This paper puts forward a storage framework for mass image files based on Hadoop, and resolves the NameNode memory bottleneck that arises when small files
International Journal of Advanced Network, Monitoring and Controls, ISSUE 2, 80–83
With the increasing popularity of the open-source platform Hadoop, the meteorological industry is able to create a Meteorological Cloud (MeteCloud) platform to store data and deploy applications. In this paper, we propose an approach to building the MeteCloud platform for meteorological departments using Hadoop. We also present a backup policy for meteorological data. In addition, one kind of storage process for the meteorological A-format file is presented. Furthermore, we experiment with one-year
International Journal on Smart Sensing and Intelligent Systems, ISSUE 2, 648–663
set must not be less than the minimum support threshold. Secondly, the process moves to strong association rule generation, where a rule must satisfy the support and confidence thresholds at the same time. Thirdly, only rules that contain the collection items are retained; of the rules generated, only those whose confidence is greater than or equal to MinConfidence are kept.
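The support/confidence filtering described above can be sketched as follows. This is a minimal illustration of generating strong rules from one frequent itemset, not the paper's implementation; the transaction data and item names are purely hypothetical.

```python
from itertools import combinations

# Hypothetical toy transactions -- illustrative only, not from the paper.
transactions = [
    {"milk", "bread", "butter"},
    {"milk", "bread"},
    {"bread", "butter"},
    {"milk", "butter"},
]

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def rules(frequent_itemset, min_confidence):
    """Yield rules A -> B from a frequent itemset whose confidence
    is greater than or equal to min_confidence."""
    for r in range(1, len(frequent_itemset)):
        for antecedent in combinations(frequent_itemset, r):
            a = frozenset(antecedent)
            # confidence(A -> B) = support(A u B) / support(A)
            conf = support(frequent_itemset) / support(a)
            if conf >= min_confidence:
                yield a, frequent_itemset - a, conf

for a, b, c in rules(frozenset({"milk", "bread"}), 0.6):
    print(set(a), "->", set(b), round(c, 2))
```

With these toy transactions, {milk, bread} has support 0.5 and both single-item antecedents have support 0.75, so both directions pass a 0.6 confidence threshold.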
The design of the Hadoop framework originated from an open-source project developed by the Apache organization
International Journal of Advanced Network, Monitoring and Controls, ISSUE 3, 100–105
Big data storage management is one of the most challenging issues for Hadoop cluster environments, since many data-intensive applications involve a high degree of data access locality. In traditional approaches, high-performance computing consists of dedicated servers used for data storage and data replication. Therefore, to address the disparateness among jobs and resources, a "Disparateness-Aware Scheduling algorithm" is proposed in the
E. Laxmi Lydia,
International Journal of Advanced Network, Monitoring and Controls, ISSUE 2, 34–46
In order to solve the problem of information overload in music systems against a big data background, this paper studies the design of a distributed music recommendation system based on Hadoop. The proposed algorithm is built on the MapReduce distributed computing framework, which offers high scalability and performance and can efficiently be applied to the computation and analysis of offline data. The music recommendation system designed in this paper also includes client, server interface
International Journal of Advanced Network, Monitoring and Controls, ISSUE 2, 126–132
large amounts of data based on the Hadoop system by using its distributed storage and parallel processing mechanism.
International Journal of Advanced Network, Monitoring and Controls, ISSUE 4, 26–30
Face recognition, one of the methods of identifying individuals, is being enhanced at a fast rate. This paper demonstrates the process of detecting individuals' faces through a live monitoring camera using MATLAB, and also aids in tracking them. The large number of images collected each second is stored in big databases such as Hadoop databases (HBase) or MongoDB, as they are known for their high processing speed. The facial features are extracted from all the images and
P.J Leo Evenss,
Jennings Mcenroe .S,
International Journal on Smart Sensing and Intelligent Systems, ISSUE 5, 163–173
significance to design a big data model based on collaborative filtering. Processing big data on a single computer is not feasible, so a distributed architecture is particularly important; the algorithm model therefore runs under the Hadoop distributed framework. MapReduce is a distributed computing framework under Hadoop. It uses the "divide and conquer" idea to decompose complex tasks or data into several simple tasks for parallel processing. Afterwards, it
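The "divide and conquer" flow just described can be illustrated with a minimal, single-process word-count sketch. Hadoop would run the map and reduce tasks in parallel across cluster nodes; all function and variable names here are illustrative, not part of the Hadoop API.

```python
from collections import defaultdict
from itertools import chain

def map_phase(chunk):
    """Map: emit (word, 1) pairs for each record in one data chunk."""
    return [(word, 1) for line in chunk for word in line.split()]

def shuffle(pairs):
    """Shuffle: group intermediate values by key."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: combine the grouped values for each key."""
    return {key: sum(values) for key, values in grouped.items()}

# The input is split into chunks, each mapped independently (in Hadoop,
# on different nodes), then shuffled and reduced.
chunks = [["hadoop mapreduce", "hadoop hdfs"], ["mapreduce shuffle"]]
pairs = chain.from_iterable(map_phase(c) for c in chunks)
print(reduce_phase(shuffle(pairs)))
# {'hadoop': 2, 'mapreduce': 2, 'hdfs': 1, 'shuffle': 1}
```

Each chunk is processed without knowledge of the others, which is what makes the map step embarrassingly parallel; only the shuffle and reduce steps need to see data grouped by key.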
International Journal of Advanced Network, Monitoring and Controls, ISSUE 3, 1–8