1. Experimental Design of Big Data File Transfer Based on HDFS.
- Author
-
刘文杰
- Subjects
-
*DISTRIBUTED computing, *COMPUTING platforms, *BIG data, *COLLEGE campuses, *ELECTRONIC data processing
- Abstract
-
With the development of cloud computing technology and related research, the cloud programming model has also undergone new technological innovation. In campus-network experimental teaching systems at colleges, cloud-platform experiments have become a main component of big data analysis. Therefore, building a stable, practical experiment platform on the HDFS architecture that meets the needs of the experimental course system has become a new topic in campus-network experimental research at colleges and universities. In this paper, we use the open-source cloud computing platform Hadoop as the basic platform for big data analysis experiments, and this basic experimental platform is used for data-processing optimization. HDFS provides the underlying storage support for distributed computing and realizes communication between the NameNode and DataNodes. User files are stored on the nodes as data blocks, so that client read and write requests can be processed in a timely manner, and data blocks can be created, deleted, replicated, and mapped under the unified scheduling of the NameNode. At the same time, we can target the experiment process according to the experiment methods. [ABSTRACT FROM AUTHOR]
- Published
- 2019
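The NameNode block scheduling described in the abstract (files split into blocks, blocks mapped to DataNodes, client read requests answered from that mapping) can be sketched as a small simulation. This is an illustrative Python model under assumed HDFS defaults (128 MiB blocks, replication factor 3), not Hadoop's actual implementation; all names such as `SimpleNameNode` are hypothetical.

```python
# Illustrative simulation (NOT the HDFS implementation) of the NameNode's
# block mapping: split a file into fixed-size blocks and assign each block
# to several DataNodes for replication. Assumed HDFS defaults below.
from itertools import cycle

BLOCK_SIZE = 128 * 1024 * 1024  # assumed HDFS default block size: 128 MiB
REPLICATION = 3                 # assumed HDFS default replication factor


class SimpleNameNode:
    """Toy metadata server: maps (file, block index) -> list of DataNodes."""

    def __init__(self, datanodes):
        self.datanodes = datanodes
        self.block_map = {}              # (filename, block_no) -> [datanode, ...]
        self._picker = cycle(datanodes)  # round-robin replica placement

    def add_file(self, filename, size_bytes):
        """Split the file into blocks and record replica placement for each."""
        n_blocks = max(1, -(-size_bytes // BLOCK_SIZE))  # ceiling division
        for block_no in range(n_blocks):
            replicas = [next(self._picker)
                        for _ in range(min(REPLICATION, len(self.datanodes)))]
            self.block_map[(filename, block_no)] = replicas
        return n_blocks

    def locate(self, filename, block_no):
        """Answer a client's read request with the block's replica locations."""
        return self.block_map[(filename, block_no)]


nn = SimpleNameNode(["dn1", "dn2", "dn3", "dn4"])
blocks = nn.add_file("/user/lab/input.dat", 300 * 1024 * 1024)  # 300 MiB file
print(blocks)                                  # -> 3 blocks
print(nn.locate("/user/lab/input.dat", 0))     # -> ['dn1', 'dn2', 'dn3']
```

A real DataNode would additionally send heartbeats and block reports to the NameNode, which re-replicates blocks when a node is lost; the round-robin placement here stands in for HDFS's rack-aware placement policy.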