I am trying to copy a large number of files into Hadoop Hive. Right now I do this with a two-step script process.

  • Step 1: A PowerShell script copies the files from Windows to Linux using PuTTY's scp tool (pscp).

  • Step 2: A Bash script copies the files from Linux to HDFS using the Hadoop put command.

Is there a way to do this in a single step?
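For reference, the two-step flow can be sketched roughly as follows. The host name, user, and all paths here are hypothetical placeholders, and the HDFS step is guarded so the sketch is safe to run on a box without Hadoop on the PATH:

```shell
# Step 1 -- run on the Windows side with PuTTY's pscp (hypothetical host/paths):
#   pscp -r C:\exports\* etluser@edge-node:/staging/exports/

# Step 2 -- run on the Linux side: push the staged files into HDFS.
SRC=/staging/exports
DEST=/user/hive/warehouse/mydb.db/mytable

if command -v hdfs >/dev/null 2>&1; then
    # -f overwrites files that already exist at the destination
    hdfs dfs -put -f "$SRC"/* "$DEST"
else
    echo "hdfs not on PATH; would run: hdfs dfs -put -f $SRC/* $DEST"
fi
```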

Seaport
  • I tried out Hortonworks' Sandbox. Via Ambari I can upload files (one file at a time) directly into HDFS. I wonder whether Hortonworks provides any tool to automate this process. – Seaport Mar 01 '19 at 00:04

1 Answer


I figured out one solution: I mounted the Windows share on the Linux box so I could treat the share as a local Linux path, then copied the files directly from that path to an HDFS location.
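A minimal sketch of that one-step approach, assuming a CIFS mount via cifs-utils; the share name, mount point, credentials, and HDFS path are all hypothetical, and the commands are guarded so the sketch is safe to run on a machine without cifs-utils or Hadoop installed:

```shell
SHARE=//winserver/exports          # hypothetical Windows share
MNT=/mnt/winshare                  # hypothetical mount point on the Linux box
DEST=/user/hive/warehouse/mydb.db/mytable   # hypothetical HDFS target

if command -v mount.cifs >/dev/null 2>&1 && command -v hdfs >/dev/null 2>&1; then
    # Mounting requires root; ro keeps the share read-only on this box.
    sudo mount -t cifs "$SHARE" "$MNT" -o username=winuser,ro
    # The mounted share now behaves like a local path for hdfs dfs -put.
    hdfs dfs -put -f "$MNT"/* "$DEST"
else
    echo "cifs-utils or hdfs missing; would mount $SHARE at $MNT and put to $DEST"
fi
```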

Seaport