Video lecture 2.9. Big Data Introduction - What Is Apache Hadoop?, video 10 of the Apache Hadoop course from the CloudxLab Official channel, free and available online.
Hadoop was created by Doug Cutting to build his search engine, Nutch; he was later joined by Mike Cafarella. Hadoop was based on three papers published by Google: the Google File System, Google MapReduce, and Google Bigtable.
It is named after the toy elephant of Doug Cutting's son.
Hadoop is released under the Apache license, which means you can use it anywhere without having to worry about licensing.
It is quite powerful, popular and well supported.
It is a framework to handle Big Data.
Although it started as a single project, Hadoop is now an umbrella of projects. All of the projects under the Apache Hadoop umbrella should exhibit three characteristics:
1. Distributed - They should be able to use multiple machines to solve a problem.
2. Scalable - It should be easy to add more machines when needed.
3. Reliable - If some of the machines fail, the system should keep working.
These are the three criteria that every project or component must meet to sit under the Apache Hadoop umbrella.
Hadoop is written in Java so that it can run on all kinds of devices.
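
As a concrete illustration of the "distributed" characteristic, here is a minimal sketch of the classic word-count job written against the standard Hadoop MapReduce Java API. The class name WordCount and the input/output path arguments are illustrative only and are not taken from the course material: the mapper runs in parallel on each machine's slice of the input, and the reducer aggregates the partial counts.

// Minimal word-count sketch using the Hadoop MapReduce Java API.
// Class name and paths are illustrative, not part of the course material.
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: each node processes its own split of the input and emits (word, 1) pairs.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer: sums the counts for each word; failed tasks are re-run on other nodes.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // combiner cuts data shuffled over the network
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));    // input directory in HDFS
    FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output directory in HDFS
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Such a job would typically be packaged into a jar and launched with something like "hadoop jar wordcount.jar WordCount /input /output" (the jar name and paths here are hypothetical); adding machines to the cluster lets the same code scale out, and failed tasks are simply rescheduled on healthy nodes.
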
This Big Data Tutorial will help you learn HDFS, ZooKeeper, Hive, HBase, NoSQL, Oozie, Flume, Sqoop, Spark, Spark RDD, Spark Streaming, Kafka, SparkR, SparkSQL, MLlib, and GraphX from scratch. Everything in this course is explained with relevant examples, so you will actually know how to implement the topics you learn.
Let us know in the comments below if you find it helpful.
In order to claim the certificate from E&ICT Academy, IIT Roorkee, visit https://bit.ly/cxlyoutube
________
Website https://www.cloudxlab.com
Facebook https://www.facebook.com/cloudxlab
Instagram https://www.instagram.com/cloudxlab
Twitter http://www.twitter.com/cloudxlab