Welcome to the Simplilearn Community


Big Data Hadoop & Spark Developer | Amit Mishra | 04 Sep - 16 Oct 21

Mohan reddy_8

Administrator
Staff member
Simplilearn Support
Alumni
Trainer
Customer
Hi Learners,

Greetings!
Hope you all are doing well.

This is your platform for discussion.

Regards
Mohan Reddy
Simplilearn
 

Anita_50

Member
Hi, I am getting an error while running this command:

sqoop import --connect jdbc:mysql://sqoopdb.slbdh.cloudlabs.com/anitasingh14121996gmail --usernamae anitasingh14121996gmail --password anitasingh14121996gmailh6cly --table emp576ok -m 1 --target-dir '/user/anitasingh14121996gmail/employee'
The error is:
21/09/12 09:45:22 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7-cdh6.3.2
21/09/12 09:45:22 ERROR tool.BaseSqoopTool: Error parsing arguments for import:
21/09/12 09:45:22 ERROR tool.BaseSqoopTool: Unrecognized argument: --usernamae
21/09/12 09:45:22 ERROR tool.BaseSqoopTool: Unrecognized argument: anitasingh14121996gmail
21/09/12 09:45:22 ERROR tool.BaseSqoopTool: Unrecognized argument: --password
21/09/12 09:45:22 ERROR tool.BaseSqoopTool: Unrecognized argument: anitasingh14121996gmailh6cly
21/09/12 09:45:22 ERROR tool.BaseSqoopTool: Unrecognized argument: --table
21/09/12 09:45:22 ERROR tool.BaseSqoopTool: Unrecognized argument: emp576ok
21/09/12 09:45:22 ERROR tool.BaseSqoopTool: Unrecognized argument: -m
21/09/12 09:45:22 ERROR tool.BaseSqoopTool: Unrecognized argument: 1
21/09/12 09:45:22 ERROR tool.BaseSqoopTool: Unrecognized argument: --target-dir
21/09/12 09:45:22 ERROR tool.BaseSqoopTool: Unrecognized argument: /user/anitasingh14121996gmail/employe
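The first unrecognized argument is `--usernamae`, a typo for `--username`; once Sqoop hits a flag it does not know, it reports every following token as unrecognized too. The same command with only that flag corrected (credentials and paths exactly as in the post above) would be:

```shell
# Only change from the original command: --usernamae -> --username.
# Sqoop stops parsing at the first unknown flag, which is why every
# later token showed up in the error log as "Unrecognized argument".
sqoop import \
  --connect jdbc:mysql://sqoopdb.slbdh.cloudlabs.com/anitasingh14121996gmail \
  --username anitasingh14121996gmail \
  --password anitasingh14121996gmailh6cly \
  --table emp576ok \
  -m 1 \
  --target-dir '/user/anitasingh14121996gmail/employee'
```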
 
What will happen if the secondary namenode goes down?
The secondary namenode does not play any active role in the cluster; its purpose is to periodically fetch the edit log from the active namenode and merge it into a fresh FSImage (checkpointing). Only if the active namenode goes down is that FSImage copied from the secondary namenode back to the active namenode.
So there is no immediate impact if the secondary namenode goes down; it just means the latest edit log will not be checkpointed into the FSImage.
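For context, a sketch of how that checkpoint cadence is configured (values shown are the stock Hadoop defaults, not taken from this thread): the secondary namenode triggers a checkpoint after a time interval or a transaction count, whichever comes first, via two hdfs-site.xml properties.

```xml
<!-- hdfs-site.xml: checkpoint cadence for the secondary namenode.
     While no checkpoint runs, the edit log simply keeps growing. -->
<property>
  <name>dfs.namenode.checkpoint.period</name>
  <value>3600</value>    <!-- seconds between checkpoints (default) -->
</property>
<property>
  <name>dfs.namenode.checkpoint.txns</name>
  <value>1000000</value> <!-- or after this many uncheckpointed transactions -->
</property>
```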
 
Please suggest on the questions below.
Q1> If I have a table in Hive, how can I import its data into an HBase table?

Q2> Create a data pipeline using Sqoop and Flume to pull the table data below from the MySQL server into an HBase table with one column family.
So first, I will have to export the data from the MySQL table into a directory using Sqoop. (What if I already have the same data in a Hive table?)
Second, create the table in HBase.
Third, write a Flume agent to read data from the directory and insert it into HBase. (Any other way?)
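Two possible approaches for these questions (a sketch only; all table, column, and column-family names here are hypothetical placeholders, not from the thread):

```shell
# Q1 (sketch): map an HBase table into Hive with the HBase storage handler,
# then a plain INSERT ... SELECT copies the Hive data into HBase.
hive -e "
CREATE TABLE emp_hbase (key INT, name STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf1:name')
TBLPROPERTIES ('hbase.table.name' = 'emp');
INSERT OVERWRITE TABLE emp_hbase SELECT id, name FROM emp_hive;"

# Q2 (sketch): for the 'any other way?' part, Sqoop can load MySQL rows
# directly into HBase with --hbase-table / --column-family, which skips
# both the intermediate HDFS directory and the Flume agent.
sqoop import \
  --connect jdbc:mysql://<host>/<db> --username <user> -P \
  --table emp \
  --hbase-table emp --column-family cf1 --hbase-create-table \
  -m 1
```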
 

Braj Mohan Sharma

Customer
The secondary namenode does not play any active role in the cluster; its purpose is to periodically fetch the edit log from the active namenode and merge it into a fresh FSImage (checkpointing). Only if the active namenode goes down is that FSImage copied from the secondary namenode back to the active namenode.
So there is no immediate impact if the secondary namenode goes down; it just means the latest edit log will not be checkpointed into the FSImage.
Yeah, but if the secondary namenode is down for a long time, will the namenode itself restart to convert the edit log into the FSImage, or will it wait for the secondary namenode to resume its services to do the conversion?
 