As you are trying to load incremental data into HDFS, the Sqoop command would be as follows:
Command: sqoop import --connect jdbc:mysql://localhost/edurekadb \
--username root --table employee \
--target-dir '/user/edureka/sqoop_warehouse/employee' \
--check-column 'last_update_at' \
--incremental append --last-value xxxx

Brief description of the parameters required for loading incremental data into HDFS:

--incremental: since we want to import only new rows without changing the existing ones, we use the append mode.
--check-column: the column that should be checked for newly appended data. Usually it is the column that acts as the primary key of the table.
--last-value: the last value that was successfully imported into Hadoop; only rows whose check-column value is greater than this are imported.

I would suggest you go through the document in the attachment, which explains loading incremental data using Sqoop in detail with an example, and try it on your end.

Please try it and let me know if this helps you. If you are still facing the issue, please try restarting the Edureka VM and check again.
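If you do not want to pass --last-value by hand on every run, one option is to wrap the import in a saved Sqoop job, which stores the last imported value in Sqoop's metastore and reuses it automatically. The sketch below assumes the same connection details as above; the job name incremental_employee is just an illustrative choice, and xxxx should be replaced with the last value already loaded:

Command: sqoop job --create incremental_employee -- import \
--connect jdbc:mysql://localhost/edurekadb \
--username root --table employee \
--target-dir '/user/edureka/sqoop_warehouse/employee' \
--check-column 'last_update_at' \
--incremental append --last-value xxxx

Command: sqoop job --exec incremental_employee

After each --exec run, Sqoop updates the stored last value for the job, so subsequent runs pick up only the newly appended rows without you specifying --last-value again. You can verify the saved value with sqoop job --show incremental_employee.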