(8) Implementing a Simple Hadoop-Based Network Disk Application, Part 4
File structure (reconstructed from the code below: pages under WebContent, servlets in com.controller, the HDFS access layer in com.model):

    src/
      com/controller/
        DeleteFileServlet.java
        DocumentServlet.java
        DownloadServlet.java
        LogoutServlet.java
        UploadServlet.java
      com/model/
        HdfsDAO.java
    WebContent/
      head.jsp
      index.jsp
      document.jsp
      login.jsp
(1) index.jsp, the home page
<%@ include file="head.jsp"%>
<%@ page language="java" contentType="text/html; charset=UTF-8" pageEncoding="UTF-8"%>
<%@ page import="org.apache.hadoop.fs.FileStatus"%>
<body style="text-align:center;margin-bottom:100px;">
  <div class="navbar">
    <div class="navbar-inner">
      <a class="brand" href="#" style="margin-left:200px;">网盘</a>
      <ul class="nav">
        <li><a href="LogoutServlet">退出</a></li>
      </ul>
    </div>
  </div>
  <div style="margin:0px auto; text-align:left;width:1200px; height:50px;">
    <form class="form-inline" method="POST" enctype="MULTIPART/FORM-DATA" action="UploadServlet">
      <div style="line-height:50px;float:left;">
        <input type="submit" name="submit" value="上传文件" />
      </div>
      <div style="line-height:50px;float:left;">
        <input type="file" name="file1" size="30"/>
      </div>
    </form>
  </div>
  <div style="margin:0px auto; width:1200px;height:500px; background:#fff">
    <table class="table table-hover" style="width:1000px;margin-left:100px;">
      <tr style="border-bottom:2px solid #ddd">
        <td>文件名</td><td style="width:100px">类型</td><td style="width:100px;">大小(KB)</td><td style="width:100px;">操作</td><td style="width:100px;">操作</td>
      </tr>
      <%
        FileStatus[] list = (FileStatus[]) request.getAttribute("list");
        if (list != null)
          for (int i = 0; i < list.length; i++) {
      %>
      <tr style="border-bottom:1px solid #eee">
        <%
          // Directories link into DocumentServlet for browsing; plain files are just named
          if (list[i].isDir()) {
            out.print("<td><a href=\"DocumentServlet?filePath=" + list[i].getPath() + "\">" + list[i].getPath().getName() + "</a></td>");
          } else {
            out.print("<td>" + list[i].getPath().getName() + "</td>");
          }
        %>
        <td><%= (list[i].isDir() ? "目录" : "文件") %></td>
        <td><%= list[i].getLen() / 1024 %></td>
        <td><a href="DeleteFileServlet?filePath=<%= java.net.URLEncoder.encode(list[i].getPath().toString(), "GB2312") %>">x</a></td>
        <td><a href="DownloadServlet?filePath=<%= java.net.URLEncoder.encode(list[i].getPath().toString(), "GB2312") %>">下载</a></td>
      </tr>
      <%
          }
      %>
    </table>
  </div>
</body>
</html>
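Note the encoding round trip on file paths: the JSP URL-encodes each HDFS path as GB2312, and the servlets later rebuild the path with new String(param.getBytes("ISO-8859-1"), "GB2312"), because Tomcat decodes query parameters as ISO-8859-1 by default. A small standalone sketch of the round trip (the path value here is just an example):

    import java.net.URLDecoder;
    import java.net.URLEncoder;

    public class PathEncodingDemo {
        public static void main(String[] args) throws Exception {
            String path = "hdfs://localhost:9000/user/root/测试.txt"; // illustrative path

            // What the JSP emits into the link
            String encoded = URLEncoder.encode(path, "GB2312");

            // Tomcat decodes query parameters as ISO-8859-1 by default,
            // so the servlet re-interprets those bytes as GB2312
            String asSeenByServlet = URLDecoder.decode(encoded, "ISO-8859-1");
            String restored = new String(asSeenByServlet.getBytes("ISO-8859-1"), "GB2312");

            System.out.println(restored.equals(path)); // true
        }
    }

This works because ASCII bytes are identical in ISO-8859-1 and GB2312, so only the Chinese characters need the re-decode.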
(2) document.jsp, the directory view
<%@ include file="head.jsp"%>
<%@ page language="java" contentType="text/html; charset=UTF-8" pageEncoding="UTF-8"%>
<%@ page import="org.apache.hadoop.fs.FileStatus"%>
<body style="text-align:center;margin-bottom:100px;">
  <div class="navbar">
    <div class="navbar-inner">
      <a class="brand" href="#" style="margin-left:200px;">网盘</a>
      <ul class="nav">
        <li class="active"><a href="#">首页</a></li>
        <li><a href="#">Link</a></li>
        <li><a href="#">Link</a></li>
      </ul>
    </div>
  </div>
  <div style="margin:0px auto; text-align:left;width:1200px; height:50px;">
    <form class="form-inline" method="POST" enctype="MULTIPART/FORM-DATA" action="UploadServlet">
      <div style="line-height:50px;float:left;">
        <input type="submit" name="submit" value="上传文件" />
      </div>
      <div style="line-height:50px;float:left;">
        <input type="file" name="file1" size="30"/>
      </div>
    </form>
  </div>
  <div style="margin:0px auto; width:1200px;height:500px; background:#fff">
    <table class="table table-hover" style="width:1000px;margin-left:100px;">
      <tr><td>文件名</td><td>属性</td><td>大小(KB)</td><td>操作</td><td>操作</td></tr>
      <%
        FileStatus[] list = (FileStatus[]) request.getAttribute("documentList");
        if (list != null)
          for (int i = 0; i < list.length; i++) {
      %>
      <tr style="border-bottom:2px solid #ddd">
        <%
          if (list[i].isDir()) {
            out.print("<td><a href=\"DocumentServlet?filePath=" + list[i].getPath() + "\">" + list[i].getPath().getName() + "</a></td>");
          } else {
            out.print("<td>" + list[i].getPath().getName() + "</td>");
          }
        %>
        <td><%= (list[i].isDir() ? "目录" : "文件") %></td>
        <td><%= list[i].getLen() / 1024 %></td>
        <td><a href="DeleteFileServlet?filePath=<%= java.net.URLEncoder.encode(list[i].getPath().toString(), "GB2312") %>">x</a></td>
        <td><a href="DownloadServlet?filePath=<%= java.net.URLEncoder.encode(list[i].getPath().toString(), "GB2312") %>">下载</a></td>
      </tr>
      <%
          }
      %>
    </table>
  </div>
</body>
</html>
(3) DeleteFileServlet
package com.controller;

import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.mapred.JobConf;

import com.model.HdfsDAO;

/**
 * Servlet implementation class DeleteFileServlet
 */
public class DeleteFileServlet extends HttpServlet {

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // The JSP URL-encodes the path as GB2312; Tomcat decodes query
        // parameters as ISO-8859-1 by default, so re-decode here
        String filePath = new String(request.getParameter("filePath").getBytes("ISO-8859-1"), "GB2312");

        JobConf conf = HdfsDAO.config();
        HdfsDAO hdfs = new HdfsDAO(conf);
        hdfs.rmr(filePath); // recursive delete on HDFS
        System.out.println("====" + filePath + "====");

        // Refresh the listing and return to the home page.
        // Note: the listing path is hard-coded here, unlike the
        // per-user directory used by UploadServlet.
        FileStatus[] list = hdfs.ls("/user/root/");
        request.setAttribute("list", list);
        request.getRequestDispatcher("index.jsp").forward(request, response);
    }

    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        this.doGet(request, response);
    }
}
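All of the servlets delegate HDFS work to com.model.HdfsDAO, which is not listed in this installment (it appeared earlier in the series). A minimal sketch consistent with the calls made here — config(), ls(), rmr(), copyFile(), and download() — might look like the following; the NameNode URI and the method bodies are assumptions, not the original code:

    package com.model;

    import java.io.IOException;
    import java.net.URI;

    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapred.JobConf;

    /** Hypothetical reconstruction of HdfsDAO; only the methods the servlets call. */
    public class HdfsDAO {

        // Assumed NameNode address; the original series configures its own
        private static final String HDFS = "hdfs://localhost:9000/";

        private final JobConf conf;

        public HdfsDAO(JobConf conf) {
            this.conf = conf;
        }

        public static JobConf config() {
            JobConf conf = new JobConf(HdfsDAO.class);
            conf.setJobName("HdfsDAO");
            return conf;
        }

        /** List the entries under an HDFS directory. */
        public FileStatus[] ls(String folder) throws IOException {
            FileSystem fs = FileSystem.get(URI.create(HDFS), conf);
            FileStatus[] list = fs.listStatus(new Path(folder));
            fs.close();
            return list;
        }

        /** Recursively delete a file or directory ("rm -r"). */
        public void rmr(String folder) throws IOException {
            FileSystem fs = FileSystem.get(URI.create(HDFS), conf);
            fs.delete(new Path(folder), true);
            fs.close();
        }

        /** Copy a local file into HDFS. */
        public void copyFile(String local, String remote) throws IOException {
            FileSystem fs = FileSystem.get(URI.create(HDFS), conf);
            fs.copyFromLocalFile(new Path(local), new Path(remote));
            fs.close();
        }

        /** Copy an HDFS file to the local file system. */
        public void download(String remote, String local) throws IOException {
            FileSystem fs = FileSystem.get(URI.create(HDFS), conf);
            fs.copyToLocalFile(new Path(remote), new Path(local));
            fs.close();
        }
    }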
(4) UploadServlet
package com.controller;

import java.io.File;
import java.io.IOException;
import java.util.Iterator;
import java.util.List;

import javax.servlet.ServletContext;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.commons.fileupload.FileItem;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.mapred.JobConf;

import com.model.HdfsDAO;

/**
 * Servlet implementation class UploadServlet
 */
public class UploadServlet extends HttpServlet {

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        this.doPost(request, response);
    }

    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        request.setCharacterEncoding("UTF-8");
        File file;
        int maxFileSize = 50 * 1024 * 1024; // 50 MB upload limit
        int maxMemSize  = 50 * 1024 * 1024; // 50 MB in-memory threshold

        // Staging directory on the Tomcat server, configured in web.xml
        ServletContext context = getServletContext();
        String filePath = context.getInitParameter("file-upload");
        System.out.println("source file path:" + filePath);

        // Only handle multipart/form-data requests
        String contentType = request.getContentType();
        if (contentType != null && contentType.indexOf("multipart/form-data") >= 0) {
            DiskFileItemFactory factory = new DiskFileItemFactory();
            // Items up to this size are kept in memory
            factory.setSizeThreshold(maxMemSize);
            // Larger items are spilled to this temp directory
            factory.setRepository(new File("c:\\temp"));

            // Create a new file upload handler with an upload size limit
            ServletFileUpload upload = new ServletFileUpload(factory);
            upload.setSizeMax(maxFileSize);

            try {
                // Parse the multipart request into form fields and file items
                List fileItems = upload.parseRequest(request);
                Iterator i = fileItems.iterator();
                System.out.println("begin to upload file to tomcat server");
                while (i.hasNext()) {
                    FileItem fi = (FileItem) i.next();
                    if (!fi.isFormField()) {
                        // File name as sent by the browser; older IE includes
                        // the full client path, so strip everything up to "\"
                        String fileName = fi.getName();
                        String fn = fileName.substring(fileName.lastIndexOf("\\") + 1);
                        System.out.println(fn);

                        // Stage the file in the local upload directory first
                        file = new File(filePath, fn);
                        fi.write(file);
                        System.out.println("upload file to tomcat server success!");

                        // Then copy the staged file from Tomcat into HDFS,
                        // under a directory named after the logged-in user
                        // (Windows-style separators, matching the original environment)
                        System.out.println("begin to upload file to hadoop hdfs");
                        String username = (String) request.getSession().getAttribute("username");
                        JobConf conf = HdfsDAO.config();
                        HdfsDAO hdfs = new HdfsDAO(conf);
                        hdfs.copyFile(filePath + "\\" + fn, "/" + username + "/" + fn);
                        System.out.println("upload file to hadoop hdfs success!");
                        System.out.println("username-----" + username);

                        // Refresh the listing and return to the home page
                        FileStatus[] list = hdfs.ls("/" + username);
                        request.setAttribute("list", list);
                        request.getRequestDispatcher("index.jsp").forward(request, response);
                    }
                }
            } catch (Exception ex) {
                System.out.println(ex);
            }
        } else {
            System.out.println("No file uploaded");
        }
    }
}
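The staging directory comes from the file-upload context parameter, so web.xml needs an entry along these lines (the path value here is illustrative, not from the original project):

    <!-- Staging directory for uploads, read via
         context.getInitParameter("file-upload") -->
    <context-param>
      <param-name>file-upload</param-name>
      <param-value>c:\upload</param-value>
    </context-param>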
(5) DownloadServlet
package com.controller;

import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.mapred.JobConf;

import com.model.HdfsDAO;

/**
 * Servlet implementation class DownloadServlet
 */
public class DownloadServlet extends HttpServlet {
    private static final long serialVersionUID = 1L;

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // Note: this copies the file from HDFS to C:/ on the *server*;
        // nothing is sent to the browser
        String local = "C:/";
        String filePath = new String(request.getParameter("filePath").getBytes("ISO-8859-1"), "GB2312");
        System.out.println(filePath);

        JobConf conf = HdfsDAO.config();
        HdfsDAO hdfs = new HdfsDAO(conf);
        hdfs.download(filePath, local);

        // Refresh the listing and return to the home page
        FileStatus[] list = hdfs.ls("/user/root/");
        request.setAttribute("list", list);
        request.getRequestDispatcher("index.jsp").forward(request, response);
    }

    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        this.doGet(request, response);
    }
}
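As written, download() only copies the file to C:/ on the machine running Tomcat, and the stream imports in the original listing (BufferedInputStream, ServletOutputStream, and so on) suggest streaming to the browser was the intent. A sketch of how doGet could instead stream the HDFS file straight to the response, assuming the same HdfsDAO.config() and the NameNode URI from the HdfsDAO sketch above, with the attachment file-name encoding simplified:

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        String filePath = new String(request.getParameter("filePath").getBytes("ISO-8859-1"), "GB2312");
        org.apache.hadoop.fs.Path src = new org.apache.hadoop.fs.Path(filePath);

        // Assumed NameNode address, matching the HdfsDAO sketch above
        org.apache.hadoop.fs.FileSystem fs = org.apache.hadoop.fs.FileSystem.get(
                java.net.URI.create("hdfs://localhost:9000/"), HdfsDAO.config());

        response.setContentType("application/octet-stream");
        response.setHeader("Content-Disposition", "attachment; filename=\"" + src.getName() + "\"");

        // Copy the HDFS stream to the HTTP response
        java.io.InputStream in = fs.open(src);
        try {
            javax.servlet.ServletOutputStream out = response.getOutputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) > 0) {
                out.write(buf, 0, n);
            }
            out.flush();
        } finally {
            in.close();
            fs.close();
        }
    }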
(6) DocumentServlet
package com.controller;

import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.mapred.JobConf;

import com.model.HdfsDAO;

/**
 * Servlet implementation class DocumentServlet
 */
public class DocumentServlet extends HttpServlet {
    private static final long serialVersionUID = 1L;

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // Decode the GB2312-encoded HDFS path from the query string
        String filePath = new String(request.getParameter("filePath").getBytes("ISO-8859-1"), "GB2312");

        // List the directory that was clicked and show it on document.jsp
        JobConf conf = HdfsDAO.config();
        HdfsDAO hdfs = new HdfsDAO(conf);
        FileStatus[] documentList = hdfs.ls(filePath);
        request.setAttribute("documentList", documentList);
        request.getRequestDispatcher("document.jsp").forward(request, response);
    }

    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        this.doGet(request, response);
    }
}
(7) LogoutServlet

package com.controller;

import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

/**
 * Servlet implementation class LogoutServlet
 */
public class LogoutServlet extends HttpServlet {
    private static final long serialVersionUID = 1L;

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // Logging out just drops the username from the session
        HttpSession session = request.getSession();
        session.removeAttribute("username");
        request.getRequestDispatcher("login.jsp").forward(request, response);
    }

    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        this.doGet(request, response);
    }
}
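Since login state is nothing more than the username session attribute, every servlet above assumes it is present. A simple filter (a sketch, not part of the original project) could send anonymous requests back to login.jsp; it would also need a filter-mapping entry in web.xml:

    package com.controller;

    import java.io.IOException;

    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;

    /** Hypothetical login guard; not in the original project. */
    public class LoginFilter implements Filter {
        public void init(FilterConfig cfg) {}
        public void destroy() {}

        public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
                throws IOException, ServletException {
            HttpServletRequest request = (HttpServletRequest) req;
            if (request.getSession().getAttribute("username") == null) {
                // Not logged in: back to the login page
                request.getRequestDispatcher("login.jsp").forward(req, resp);
            } else {
                chain.doFilter(req, resp);
            }
        }
    }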
With that, a simple Hadoop-based network disk application is complete. To make it feel more like a real network disk, you can spend more time implementing the remaining features.
Source code download: http://download.csdn.net/detail/wen294299195/7779949