The Apache Sqoop project was retired in June 2021 and moved to the Apache Attic.[2]
Description
Sqoop supports incremental loads of a single table or a free-form SQL query, as well as saved jobs that can be run multiple times to import updates made to a database since the last import. Imports can also be used to populate tables in Hive or HBase.[3] Exports can be used to transfer data from Hadoop into a relational database. The name Sqoop is derived from "SQL-to-Hadoop".[4]
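As an illustration, an incremental import can be launched either from the sqoop command line or programmatically. The following is a minimal sketch assuming Sqoop 1.x's Java entry point (org.apache.sqoop.Sqoop.runTool); the JDBC URL, credentials, table name, check column, and HDFS directory are placeholders.

    import org.apache.sqoop.Sqoop;

    // Minimal sketch of a programmatic incremental import with Sqoop 1.x.
    // The argument array mirrors the flags of the "sqoop import" command-line tool.
    public class SqoopImportExample {
        public static void main(String[] args) {
            String[] importArgs = {
                "import",                                 // tool name, as on the command line
                "--connect", "jdbc:mysql://dbhost/sales", // placeholder JDBC connection string
                "--username", "etl_user",                 // placeholder credentials
                "--password", "secret",
                "--table", "orders",                      // source table to import
                "--target-dir", "/data/orders",           // HDFS directory to populate
                "--incremental", "append",                // incremental load: fetch only new rows
                "--check-column", "order_id",             // column used to detect new rows
                "--last-value", "100000"                  // rows with order_id > 100000 are imported
            };
            // runTool parses the arguments and runs the underlying MapReduce job;
            // it returns 0 on success and a non-zero code on failure.
            int exitCode = Sqoop.runTool(importArgs);
            System.exit(exitCode);
        }
    }

The same arguments can be passed directly to the sqoop command (sqoop import --connect ...); adding --hive-import would load the result into a Hive table rather than a plain HDFS directory, and sqoop export reverses the direction, writing HDFS data back to a relational table.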
Sqoop became a top-level Apache project in March 2012.[5]
^"Sqoop Import". Pentaho. 2015-12-10. Archived from the original on 2015-12-10. Retrieved 2015-12-10. The Sqoop Import job allows you to import data from a relational database into the Hadoop Distributed File System (HDFS) using Apache Sqoop.
^"Sqoop Export". Pentaho. 2015-12-10. Archived from the original on 2015-12-10. Retrieved 2015-12-10. The Sqoop Export job allows you to export data from Hadoop into an RDBMS using Apache Sqoop.