Steps:
1. Creating a link for MapR-FS
2. Creating a link for Oracle
3. Creating a job
4. Starting the job
5. Checking the status of the job
Environment
mapr-sqoop2-server-2.0.0.201607271151-1.noarch
mapr-sqoop2-client-2.0.0.201607271151-1.noarch
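Once the sqoop2 shell is launched (shown below), the client can be pointed at the Sqoop2 server explicitly. A minimal sketch, assuming the server runs on VM207 at the default port 12000 (the Server URL reported later in the job output); adjust the host for your cluster:
sqoop:000> set server --host VM207 --port 12000 --webapp sqoop
sqoop:000> show version --all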
[mapr@VM207 sqoop-2.0.0]$ ./bin/sqoop2-shell
sqoop:000> show link
0 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
+----+------+--------------+----------------+---------+
| Id | Name | Connector Id | Connector Name | Enabled |
+----+------+--------------+----------------+---------+
+----+------+--------------+----------------+---------+
sqoop:000> show connector
+----+------------------------+------------------+------------------------------------------------------+----------------------+
| Id | Name                   | Version          | Class                                                | Supported Directions |
+----+------------------------+------------------+------------------------------------------------------+----------------------+
| 1  | kite-connector         | 1.99.6-mapr-1607 | org.apache.sqoop.connector.kite.KiteConnector        | FROM/TO              |
| 2  | kafka-connector        | 1.99.6-mapr-1607 | org.apache.sqoop.connector.kafka.KafkaConnector      | TO                   |
| 3  | hdfs-connector         | 1.99.6-mapr-1607 | org.apache.sqoop.connector.hdfs.HdfsConnector        | FROM/TO              |
| 4  | generic-jdbc-connector | 1.99.6-mapr-1607 | org.apache.sqoop.connector.jdbc.GenericJdbcConnector | FROM/TO              |
+----+------------------------+------------------+------------------------------------------------------+----------------------+
sqoop:000>
1. Creating a link for MapR-FS
sqoop:000> create link -c 3
Creating link for connector with id 3
Please fill following values to create new link object
Name: maprfs
Link configuration
HDFS URI: maprfs://10.10.72.207:7222
Hadoop conf directory: /opt/mapr/hadoop/hadoop-2.7.0/etc/hadoop
New link was successfully created with validation status OK and persistent id 2
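Before creating the JDBC link in the next step, the Oracle JDBC driver has to be on the Sqoop2 server classpath, otherwise link validation typically fails because the driver class cannot be loaded. A rough sketch, assuming an ojdbc6.jar and the Tomcat-based layout of this Sqoop2 build; the exact directory and the way you restart the server depend on your install:
[mapr@VM207 sqoop-2.0.0]$ cp ojdbc6.jar server/lib/
(restart the mapr-sqoop2-server service so the driver is picked up)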
2. Creating a link for the Oracle JDBC connection
sqoop:000> create link -c 4
Creating link for connector with id 4
Please fill following values to create new link object
Name: oraclenew
Link configuration
JDBC Driver Class: oracle.jdbc.driver.OracleDriver
JDBC Connection String: jdbc:oracle:thin:@10.10.70.142:1521:VIJAYDB
Username: scott
Password: *****
JDBC Connection Properties:
There are currently 0 values in the map:
entry#
New link was successfully created with validation status OK and persistent id 5
sqoop:000> show link
+----+-----------+--------------+------------------------+---------+
| Id | Name | Connector Id | Connector Name | Enabled |
+----+-----------+--------------+------------------------+---------+
| 2 | maprfs | 3 | hdfs-connector | true |
| 5 | oraclenew | 4 | generic-jdbc-connector | true |
+----+-----------+--------------+------------------------+---------+
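The summary table above only shows ids and names. To review the full configuration of each link (HDFS URI, JDBC connection string, and so on), the shell also supports a verbose listing; for example:
sqoop:000> show link --all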
3. Creating a job
sqoop:000> create job --from 5 --to 2
Creating job for links with from id 5 and to id 2
Please fill following values to create new job object
Name: newjob
From database configuration
Schema name:
Table name: TEST_TSS_ORDER_HEADERS_F_V
Table SQL statement:
Table column names:
Partition column name: ORDER_STATUS_LKP_KEY
Null value allowed for the partition column: false
Boundary query:
Incremental read
Check column:
Last value:
To HDFS configuration
Override null value:
Null value:
Output format:
0 : TEXT_FILE
1 : SEQUENCE_FILE
Choose: 0
Compression format:
0 : NONE
1 : DEFAULT
2 : DEFLATE
3 : GZIP
4 : BZIP2
5 : LZO
6 : LZ4
7 : SNAPPY
8 : CUSTOM
Choose: 0
Custom compression format:
Output directory: /user/mapr/ora4
Append mode:
Throttling resources
Extractors:
Loaders:
New job was successfully created with validation status OK and persistent id 11
sqoop:000> show job
+----+-----------+----------------+--------------+---------+
| Id | Name      | From Connector | To Connector | Enabled |
+----+-----------+----------------+--------------+---------+
| 11 | newjob    | 4              | 3            | true    |
+----+-----------+----------------+--------------+---------+
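The job summary only lists the connector ids. To double-check the complete job configuration (table, partition column, output directory, throttling) before running it, the verbose form can be used; a small example:
sqoop:000> show job --all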
4. Starting the job
sqoop:000> start job -j 11
Submission details
Job ID: 11
Server URL: http://VM207:12000/sqoop/
Created by: mapr
Creation date: 2016-10-27 12:45:48 IST
Lastly updated by: mapr
External ID: job_1476676093162_0026
http://VM203:8088/proxy/application_1476676093162_0026/
Source Connector schema: Schema{name=TEST_TSS_ORDER_HEADERS_F_V,columns=[
Decimal{name=BL_ORDER_KEY,nullable=true,type=DECIMAL,precision=0,scale=-127},
Decimal{name=INCIDENT_ID,nullable=true,type=DECIMAL,precision=0,scale=-127},
Decimal{name=HEADER_ID,nullable=true,type=DECIMAL,precision=0,scale=-127},
Decimal{name=RMA_NUMBER,nullable=true,type=DECIMAL,precision=0,scale=-127},
Text{name=CUST_SERIAL_NUMBER,nullable=true,type=TEXT,charSize=null},
Date{name=ORDER_CREATION_DATE,nullable=true,type=DATE_TIME,hasFraction=true,hasTimezone=false},
Date{name=ORDER_CLOSE_DATE,nullable=true,type=DATE_TIME,hasFraction=true,hasTimezone=false},
Decimal{name=ORDER_STATUS_LKP_KEY,nullable=true,type=DECIMAL,precision=0,scale=-127}]}
2016-10-27 12:45:48 IST: BOOTING - Progress is not available
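The External ID above is the YARN application id, so the transfer can also be tracked outside the sqoop2 shell, either through the ResourceManager proxy URL printed above or from the command line; for example:
[mapr@VM207 sqoop-2.0.0]$ yarn application -status application_1476676093162_0026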
5. Checking the status of the running job
sqoop:000> status job -j 11
Submission details
Job ID: 11
Server URL: http://VM207:12000/sqoop/
Created by: mapr
Creation date: 2016-10-27 12:45:48 IST
Lastly updated by: mapr
External ID: job_1476676093162_0026
http://VM203:8088/proxy/application_1476676093162_0026/
2016-10-27 12:46:26 IST: RUNNING - 15.00 %
sqoop:000> status job -j 11
Submission details
Job ID: 11
Server URL: http://VM207:12000/sqoop/
Created by: mapr
Creation date: 2016-10-27 12:45:48 IST
Lastly updated by: mapr
External ID: job_1476676093162_0026
http://VM203:8088/proxy/application_1476676093162_0026/
2016-10-27 12:47:36 IST: RUNNING - 45.00 %
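If a running job needs to be aborted (for example, a wrong output directory was given), it can be stopped from the same shell; a small example using the job id from above:
sqoop:000> stop job -j 11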
Check the loaded file in HDFS.
[root@VM204 hadoop]# hadoop fs -cat /user/mapr/ora4/0119d883-fd76-4431-af85-cffa77f9c9e5.txt
3,3,3,3,'cust1','2014-07-03 06:13:00.000','2014-07-03 06:13:00.000',3
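Since the output file name is a generated UUID, it is usually easier to list the output directory first and then cat the part files it contains; for example:
[root@VM204 hadoop]# hadoop fs -ls /user/mapr/ora4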