# DataX doriswriter
The DataX doriswriter plug-in is used to synchronize data from other data sources to Doris through DataX. The plug-in imports data via Doris' Stream Load feature and needs to be used together with the DataX service.
## About DataX
DataX is the open source version of Alibaba Cloud DataWorks Data Integration, an offline data synchronization tool/platform widely used within Alibaba Group. DataX implements efficient data synchronization between various heterogeneous data sources, including MySQL, Oracle, SQL Server, PostgreSQL, HDFS, Hive, ADS, HBase, TableStore (OTS), MaxCompute (ODPS), Hologres, and DRDS.
More details can be found at: https://github.com/alibaba/DataX/
## Usage
The code of the DataX doriswriter plug-in can be found here.

This directory is the doriswriter plug-in development environment for Alibaba DataX.

Because the doriswriter plug-in depends on some modules in the DataX code base, and these modules are not published to the official Maven repository, you need to download the complete DataX code base in order to develop and compile the doriswriter plug-in.
### Directory structure
1. `doriswriter/`

   This directory is the code directory of doriswriter, and this part of the code should be in the Doris code base. The help doc can be found in `doriswriter/doc`.

2. `init-env.sh`

   The script mainly performs the following steps:

   1. Git clone the DataX code base to the local machine.
   2. Softlink the `doriswriter/` directory to `DataX/doriswriter`.
   3. Add `<module>doriswriter</module>` to the original `DataX/pom.xml`.
   4. Change the httpclient version from 4.5 to 4.5.13 in `DataX/core/pom.xml`, because httpclient v4.5 cannot handle a 307 redirect correctly.

   After that, developers can enter `DataX/` for development. Changes in the `DataX/doriswriter` directory will be reflected in the `doriswriter/` directory, which is convenient for developers to submit code.
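For reference, a minimal shell sketch of the steps described above; this is illustrative only, not the actual contents of `init-env.sh`:

```bash
# Illustrative sketch only; the real init-env.sh may differ in detail.
git clone https://github.com/alibaba/DataX.git      # step 1: fetch the DataX code base
ln -s "$(pwd)/doriswriter" DataX/doriswriter        # step 2: softlink doriswriter/ into DataX/
# step 3: add <module>doriswriter</module> to DataX/pom.xml
# step 4: change the httpclient version from 4.5 to 4.5.13 in DataX/core/pom.xml
#         (httpclient v4.5 cannot handle a 307 redirect correctly)
```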
### How to build
1. Run `init-env.sh`.
2. Modify the code of doriswriter in `DataX/doriswriter` if you need to.
3. Build doriswriter alone:

   ```bash
   mvn clean install -pl plugin-rdbms-util,doriswriter -DskipTests
   ```

4. Build the whole DataX:

   ```bash
   mvn package assembly:assembly -Dmaven.test.skip=true
   ```

   The output will be in `target/datax/datax/`.

   hdfsreader, hdfswriter and oscarwriter need some extra jar packages. If you don't need to use these components, you can comment out the corresponding modules in `DataX/pom.xml`.
### Compilation error

If you encounter the following compilation error:

```
Could not find artifact com.alibaba.datax:datax-all:pom:0.0.1-SNAPSHOT ...
```
You can try the following solution:

1. Download alibaba-datax-maven-m2-20210928.tar.gz
2. After decompression, copy the resulting `alibaba/datax/` directory to `.m2/repository/com/alibaba/` in the local repository of the Maven installation you are using, as shown in the sketch after this list.
3. Try to compile again.
4. Commit the code of doriswriter in `doriswriter` if you need to.
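A minimal sketch of step 2, assuming the tarball was downloaded to the current directory and Maven uses the default local repository at `~/.m2`:

```bash
# Unpack the pre-built DataX artifacts and drop them into the local Maven repository.
tar -xzf alibaba-datax-maven-m2-20210928.tar.gz
cp -r alibaba/datax ~/.m2/repository/com/alibaba/
```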
## Example
### 1. Stream reads the data and imports it to Doris

For instructions on using the doriswriter plug-in, please refer to here.
### 2. MySQL reads the data and imports it to Doris
#### 1. MySQL table structure

```sql
CREATE TABLE `t_test` (
    `id` bigint(30) NOT NULL,
    `order_code` varchar(30) DEFAULT NULL COMMENT '',
    `line_code` varchar(30) DEFAULT NULL COMMENT '',
    `remark` varchar(30) DEFAULT NULL COMMENT '',
    `unit_no` varchar(30) DEFAULT NULL COMMENT '',
    `unit_name` varchar(30) DEFAULT NULL COMMENT '',
    `price` decimal(12,2) DEFAULT NULL COMMENT '',
    PRIMARY KEY (`id`) USING BTREE
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 ROW_FORMAT=DYNAMIC COMMENT='';
```
#### 2. Doris table structure

```sql
CREATE TABLE `ods_t_test` (
    `id` bigint(30) NOT NULL,
    `order_code` varchar(30) DEFAULT NULL COMMENT '',
    `line_code` varchar(30) DEFAULT NULL COMMENT '',
    `remark` varchar(30) DEFAULT NULL COMMENT '',
    `unit_no` varchar(30) DEFAULT NULL COMMENT '',
    `unit_name` varchar(30) DEFAULT NULL COMMENT '',
    `price` decimal(12,2) DEFAULT NULL COMMENT ''
) ENGINE=OLAP
UNIQUE KEY(`id`, `order_code`)
DISTRIBUTED BY HASH(`order_code`) BUCKETS 1
PROPERTIES (
    "replication_allocation" = "tag.location.default: 3",
    "in_memory" = "false",
    "storage_format" = "V2"
);
```
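Doris is accessible over the MySQL protocol, so the table above can be created and inspected with a standard mysql client. A quick check, assuming the FE address and query port used in the job file below (172.16.0.13:9030) and the same credentials:

```bash
# Connect to the Doris FE over the MySQL protocol and verify the target table.
mysql -h 172.16.0.13 -P 9030 -uroot -p demo -e "DESC ods_t_test;"
```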
#### 3. Create the DataX job script

```json
{
    "job": {
        "content": [
            {
                "reader": {
                    "name": "mysqlreader",
                    "parameter": {
                        "column": ["id", "order_code", "line_code", "remark", "unit_no", "unit_name", "price"],
                        "connection": [
                            {
                                "jdbcUrl": ["jdbc:mysql://localhost:3306/demo"],
                                "table": ["t_test"]
                            }
                        ],
                        "username": "root",
                        "password": "xxxxx",
                        "where": ""
                    }
                },
                "writer": {
                    "name": "doriswriter",
                    "parameter": {
                        "loadUrl": ["172.16.0.13:8030"],
                        "column": ["id", "order_code", "line_code", "remark", "unit_no", "unit_name", "price"],
                        "username": "root",
                        "password": "xxxxxx",
                        "postSql": ["select count(1) from ods_t_test"],
                        "preSql": [],
                        "flushInterval": 30000,
                        "connection": [
                            {
                                "jdbcUrl": "jdbc:mysql://172.16.0.13:9030/demo",
                                "selectedDatabase": "demo",
                                "table": ["ods_t_test"]
                            }
                        ],
                        "loadProps": {
                            "format": "json",
                            "strip_outer_array": true
                        }
                    }
                }
            }
        ],
        "setting": {
            "speed": {
                "channel": "1"
            }
        }
    }
}
```
#### 4. Execute the DataX task

For details, refer to the DataX official website.
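As a rough sketch, assuming the compiled DataX was unpacked at `target/datax/datax/` (as produced by the build above) and the job file above was saved as `job/mysql2doris.json` (an assumed file name), the task can be started with DataX's standard launcher:

```bash
cd target/datax/datax
# datax.py is DataX's standard entry point; pass it the job description file.
python bin/datax.py job/mysql2doris.json
```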