
What are the three major components of hadoop

coldplay.xixi
Release: 2023-01-13 00:34:37

Hadoop has three major components: 1. HDFS, a highly reliable, high-throughput distributed file system; 2. MapReduce, a distributed offline parallel computing framework; 3. YARN, a distributed resource management framework.


The operating environment of this article: Windows 7 system, Dell G3 computer.

Three major components of hadoop:

1. HDFS

A highly reliable, high-throughput distributed file system for storing massive amounts of data.

Distributed: data is spread across many machines in the cluster.

Safety: data is replicated, with each block stored as multiple copies on different nodes.

Data is stored in blocks of 128 MB (the default since Hadoop 2.x).

For example, a 200 MB file is split into two blocks: 128 MB + 72 MB.
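The block-splitting rule above can be sketched in plain Python. This is an illustration only, not the HDFS implementation (real HDFS is written in Java, and the block size is configured via `dfs.blocksize`):

```python
# Minimal sketch of HDFS-style block splitting (illustrative only).
BLOCK_SIZE_MB = 128  # default block size in Hadoop 2.x and later

def split_into_blocks(file_size_mb):
    """Return the sizes of the blocks a file would be stored as."""
    blocks = []
    remaining = file_size_mb
    while remaining > 0:
        blocks.append(min(BLOCK_SIZE_MB, remaining))
        remaining -= BLOCK_SIZE_MB
    return blocks

print(split_into_blocks(200))  # a 200 MB file -> [128, 72]
```

Note that the last block only occupies as much space as its actual data (72 MB here), not a full 128 MB.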

2. MapReduce

A distributed offline (batch) parallel computing framework for processing massive amounts of data.

Core idea: divide and conquer.

A large data set is divided into small data sets.

Each small data set is processed independently with the business logic (map).

The partial results are then merged and aggregated (reduce).
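The divide-and-conquer steps above can be sketched with an in-memory word count, the classic MapReduce example. This is plain Python for illustration, not the Hadoop MapReduce API:

```python
# Toy word count showing the map/reduce divide-and-conquer pattern
# (illustrative only; real Hadoop jobs use the Java Mapper/Reducer API).
from collections import Counter
from functools import reduce

def map_phase(split):
    """Process one input split independently: count words in it."""
    return Counter(split.split())

def reduce_phase(partials):
    """Merge the per-split partial counts into a final result."""
    return reduce(lambda a, b: a + b, partials, Counter())

splits = ["hello hadoop", "hello world hello"]  # two input splits
result = reduce_phase(map_phase(s) for s in splits)
print(result["hello"])  # 3
```

In a real cluster, each map call runs on a different machine close to its data block, and the framework shuffles the partial results to the reducers.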

3. YARN

A distributed resource management framework.

Manages the resources of the entire cluster (memory, CPU cores).

Allocates and schedules cluster resources among the applications that request them.
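The allocate-and-schedule role can be sketched as a toy first-fit scheduler. This is illustrative only; real YARN runs a ResourceManager with pluggable schedulers (such as the Capacity Scheduler) and hands out containers to applications:

```python
# Toy first-fit container scheduler, illustrating YARN-style
# allocation of memory + CPU cores. Not the actual YARN implementation.
def allocate(nodes, request):
    """Place a container request on the first node with enough free
    memory (MB) and vcores; return the node name, or None if no
    node currently has capacity."""
    for node in nodes:
        if (node["free_mem_mb"] >= request["mem_mb"]
                and node["free_vcores"] >= request["vcores"]):
            node["free_mem_mb"] -= request["mem_mb"]
            node["free_vcores"] -= request["vcores"]
            return node["name"]
    return None  # request waits until resources are released

nodes = [{"name": "node1", "free_mem_mb": 2048, "free_vcores": 2},
         {"name": "node2", "free_mem_mb": 8192, "free_vcores": 8}]
print(allocate(nodes, {"mem_mb": 4096, "vcores": 4}))  # node2
```

The key point the sketch captures: the scheduler tracks each node's remaining capacity and deducts resources as containers are granted.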



Source: php.cn