Importing a MySQL table into HDFS with Sqoop

2021/6/6 19:24:06

This article walks through importing a MySQL table into HDFS with Sqoop, and should serve as a practical reference for anyone tackling the same task.



sqoop help

  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  export             Export an HDFS directory to a database table
  help               List available commands
  import             Import a table from a database to HDFS
  import-all-tables  Import tables from a database to HDFS
  import-mainframe   Import datasets from a mainframe server to HDFS
  job                Work with saved jobs
  list-databases     List available databases on a server
  list-tables        List available tables in a database
  merge              Merge results of incremental imports
  metastore          Run a standalone Sqoop metastore
  version            Display version information
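Before running an import, it helps to confirm that the JDBC connection works at all; the `list-databases` and `list-tables` subcommands from the help output above do exactly that. A minimal sketch using the same host and credentials as the import below (the commands are assembled and printed here for review, since actually running them needs a reachable MySQL instance on cdh3):

```shell
# Connectivity sanity checks, assembled as strings and printed for review
# (running them requires a live MySQL server at cdh3:3306).
LIST_DBS='sqoop list-databases --connect jdbc:mysql://cdh3:3306 --username root --password 123456'
LIST_TABLES='sqoop list-tables --connect jdbc:mysql://cdh3:3306/gmall --username root --password 123456'
echo "$LIST_DBS"
echo "$LIST_TABLES"
```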
sqoop import \
  --connect jdbc:mysql://cdh3:3306/gmall \
  --username root --password 123456 \
  --table user_info \
  --columns id,login_name \
  --where "id>=10 and id<=30" \
  --target-dir /test \
  --delete-target-dir

1 --connect jdbc:mysql://cdh3:3306/gmall  JDBC connection string for the source database
2 --username root --password 123456  database username and password
3 --table user_info  the table to import
4 --columns id,login_name  which columns of the table to import
5 --where "id>=10 and id<=30"  filter condition; only rows with id between 10 and 30 (inclusive) are imported
6 --target-dir /test  target HDFS directory
7 --delete-target-dir  delete the target directory first if it already exists
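When the filter logic outgrows `--columns`/`--where`, Sqoop's free-form `--query` mode is the usual alternative (note that `--query` cannot be combined with `--table`). Below is a sketch of an equivalent import, built as a string and printed for review since it needs a live cluster to run. The `$CONDITIONS` placeholder is mandatory in the query: Sqoop substitutes split boundaries into it when parallelizing, and with a single mapper (`-m 1`) no `--split-by` is needed:

```shell
# Free-form query equivalent of the --table/--columns/--where import above.
# Single quotes around the query (written as \$CONDITIONS inside this
# double-quoted string) keep the shell from expanding the placeholder.
QUERY_IMPORT="sqoop import
  --connect jdbc:mysql://cdh3:3306/gmall
  --username root --password 123456
  --query 'select id, login_name from user_info where id>=10 and id<=30 and \$CONDITIONS'
  --target-dir /test
  --delete-target-dir
  -m 1"
echo "$QUERY_IMPORT"
```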




