Linkis Task Submission and Execution REST API Document
- The Linkis Restful interface returns responses in the following standard format:
```json
{
"method": "",
"status": 0,
"message": "",
"data": {}
}
```
Convention:
- method: the requested Restful API URI; mainly used in WebSocket mode.
- status: the return status, where -1 means not logged in, 0 means success, 1 means error, 2 means validation failed, and 3 means no access to the interface.
- message: the prompt message for the request. If status is not 0, message carries the error message, and data may contain a stack field holding the detailed stack trace.
- data: the returned payload.
For more information about the Linkis Restful interface specification, please refer to: Linkis Restful Interface Specification
1. Submit Task
Interface
/api/rest_j/v1/entrance/submit
Submission method
POST
Request Parameters
```json
{
"executionContent": {
"code": "show tables",
"runType": "sql"
},
"params": {
"variable": {// task variable
"testvar": "hello"
},
"configuration": {
"runtime": {// task runtime params
"jdbc.url": "XX"
},
"startup": { // ec start up params
"spark.executor.cores": "4"
}
}
},
"source": { //task source information
"scriptPath": "file:///tmp/hadoop/test.sql"
},
"labels": {
"engineType": "spark-2.4.3",
"userCreator": "hadoop-IDE"
}
}
```
- Sample Response
```json
{
"method": "/api/rest_j/v1/entrance/submit",
"status": 0,
"message": "Request executed successfully",
"data": {
"execID": "030418IDEhivebdpdwc010004:10087IDE_hadoop_21",
"taskID": "123"
}
}
```
execID is the unique execution ID generated for a task once it is submitted to Linkis. It is of type String and is only meaningful while the task is running, similar to the concept of a PID. The ExecID is constructed as:
`(requestApplicationName length)(executeApplicationName length)(instance length)${requestApplicationName}${executeApplicationName}${entranceInstance ip+port}${requestApplicationName}_${umUser}_${index}`
taskID is the unique ID of the task submitted by the user. It is generated by database auto-increment and is of type Long.
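As a minimal illustration, the following Python sketch submits the request above and reads execID and taskID from the standard response envelope. The gateway address and the pre-authenticated `requests.Session` (Linkis requires a login cookie or token) are assumptions, not part of this document:

```python
import requests

BASE = "http://127.0.0.1:9001"  # assumed Linkis gateway address

# Assumed to already hold valid login cookies
# (e.g. obtained via /api/rest_j/v1/user/login).
session = requests.Session()

payload = {
    "executionContent": {"code": "show tables", "runType": "sql"},
    "params": {
        "variable": {"testvar": "hello"},
        "configuration": {
            "runtime": {"jdbc.url": "XX"},
            "startup": {"spark.executor.cores": "4"},
        },
    },
    "source": {"scriptPath": "file:///tmp/hadoop/test.sql"},
    "labels": {"engineType": "spark-2.4.3", "userCreator": "hadoop-IDE"},
}

resp = session.post(f"{BASE}/api/rest_j/v1/entrance/submit", json=payload).json()
assert resp["status"] == 0, resp["message"]  # 0 means success per the convention above
exec_id = resp["data"]["execID"]
task_id = resp["data"]["taskID"]
```

The later sketches in this document reuse `BASE` and `session` from here.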
2. Get Status
Interface
/api/rest_j/v1/entrance/${execID}/status
Submission method
GET
Sample Response
```json
{
"method": "/api/rest_j/v1/entrance/{execID}/status",
"status": 0,
"message": "Get status successful",
"data": {
"execID": "${execID}",
"status": "Running"
}
}
```
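Since a submitted task runs asynchronously, clients typically poll this interface until the task finishes. A hedged sketch, reusing `session` and `BASE` from the submit example; the set of terminal state names is an assumption based on common Linkis task states, not taken from this document:

```python
import time

def wait_for_completion(exec_id, interval=5):
    """Poll the status endpoint until the task reaches a terminal state."""
    terminal = {"Succeed", "Failed", "Cancelled", "Timeout"}  # assumed terminal states
    while True:
        resp = session.get(f"{BASE}/api/rest_j/v1/entrance/{exec_id}/status").json()
        status = resp["data"]["status"]
        if status in terminal:
            return status
        time.sleep(interval)
```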
3. Get Logs
Interface
/api/rest_j/v1/entrance/${execID}/log?fromLine=${fromLine}&size=${size}
Submission method
GET
The request parameter fromLine specifies the line to start reading from, and size specifies the maximum number of log lines returned by this request.
- Sample Response. The returned fromLine must be passed as the fromLine parameter in the next request to this interface.
```json
{
"method": "/api/rest_j/v1/entrance/${execID}/log",
"status": 0,
"message": "Return log information",
"data": {
"execID": "${execID}",
"log": ["error log","warn log","info log", "all log"],
"fromLine": 56
}
}
```
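Because each response returns the cursor for the next call, reading the full log is a loop. A sketch reusing the shared `session` and `BASE`; the starting cursor of 0 and the stop condition (an unchanged fromLine) are assumptions:

```python
def fetch_logs(exec_id, size=100):
    """Page through the task log using the fromLine cursor."""
    from_line = 0  # assumed starting cursor
    pages = []
    while True:
        resp = session.get(
            f"{BASE}/api/rest_j/v1/entrance/{exec_id}/log",
            params={"fromLine": from_line, "size": size},
        ).json()
        data = resp["data"]
        # data["log"] holds the [error, warn, info, all] sections; index 3 is "all".
        pages.append(data["log"][3])
        if data["fromLine"] == from_line:  # assumption: cursor stops advancing at the end
            break
        from_line = data["fromLine"]
    return "".join(pages)
```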
4. Get Progress and Resource
Interface
/api/rest_j/v1/entrance/${execID}/progressWithResource
Submission method
GET
- Sample Response
```json
{
"method": "/api/entrance/exec_id018017linkis-cg-entrance127.0.0.1:9205IDE_hadoop_spark_2/progressWithResource",
"status": 0,
"message": "OK",
"data": {
"yarnMetrics": {
"yarnResource": [
{
"queueMemory": 9663676416,
"queueCores": 6,
"queueInstances": 0,
"jobStatus": "COMPLETED",
"applicationId": "application_1655364300926_69504",
"queue": "default"
}
],
"memoryPercent": 0.009,
"memoryRGB": "green",
"coreRGB": "green",
"corePercent": 0.02
},
"progress": 0.5,
"progressInfo": [
{
"succeedTasks": 4,
"failedTasks": 0,
"id": "jobId-1(linkis-spark-mix-code-1946915)",
"totalTasks": 6,
"runningTasks": 0
}
],
"execID": "exec_id018017linkis-cg-entrance127.0.0.1:9205IDE_hadoop_spark_2"
}
}
```
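The same polling pattern as the status interface applies here; the sketch below additionally pulls the per-queue YARN usage out of the sample structure above:

```python
def print_progress(exec_id):
    """Print overall progress plus YARN queue usage for a running task."""
    resp = session.get(
        f"{BASE}/api/rest_j/v1/entrance/{exec_id}/progressWithResource"
    ).json()
    data = resp["data"]
    print(f"progress: {data['progress']:.0%}")
    for res in data["yarnMetrics"]["yarnResource"]:
        print(f"queue={res['queue']} cores={res['queueCores']} "
              f"memory={res['queueMemory']} status={res['jobStatus']}")
```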
5. Kill Task
Interface
/api/rest_j/v1/entrance/${execID}/kill
Submission method
POST
- Sample Response
```json
{
"method": "/api/rest_j/v1/entrance/{execID}/kill",
"status": 0,
"message": "OK",
"data": {
"execID":"${execID}"
}
}
```
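Cancelling is a plain POST against the same execID; a one-line sketch using the shared `session`:

```python
def kill_task(exec_id):
    """Request cancellation of a running task; True if the request was accepted."""
    resp = session.post(f"{BASE}/api/rest_j/v1/entrance/{exec_id}/kill").json()
    return resp["status"] == 0
```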
6. Get Task Info
Interface
/api/rest_j/v1/jobhistory/{id}/get
Submission method
GET
Request Parameters:
| Parameter name | Parameter description | Request type | Required | Data type | Schema |
| --- | --- | --- | --- | --- | --- |
| id | task id | path | true | string | |
- Sample Response
```json
{
"method": null,
"status": 0,
"message": "OK",
"data": {
"task": {
"taskID": 1,
"instance": "xxx",
"execId": "exec-id-xxx",
"umUser": "test",
"engineInstance": "xxx",
"progress": "10%",
"logPath": "hdfs://xxx/xxx/xxx",
"resultLocation": "hdfs://xxx/xxx/xxx",
"status": "FAILED",
"createdTime": "2019-01-01 00:00:00",
"updatedTime": "2019-01-01 01:00:00",
"engineType": "spark",
"errorCode": 100,
"errDesc": "Task Failed with error code 100",
"executeApplicationName": "hello world",
"requestApplicationName": "hello world",
"runType": "xxx",
"paramJson": "{\"xxx\":\"xxx\"}",
"costTime": 10000,
"strongerExecId": "execId-xxx",
"sourceJson": "{\"xxx\":\"xxx\"}"
}
}
}
```
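Once a task is terminal, this record is the bridge to its output: resultLocation is the directory to pass to the result-set interfaces in the next two sections. A sketch:

```python
def get_result_location(task_id):
    """Fetch the persisted task record and return where its result sets live."""
    resp = session.get(f"{BASE}/api/rest_j/v1/jobhistory/{task_id}/get").json()
    task = resp["data"]["task"]
    return task["resultLocation"], task["status"]
```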
7. Get Result Set Info
Multiple result sets are supported.
Interface
/api/rest_j/v1/filesystem/getDirFileTrees
Submission method
GET
Request Parameters:
| Parameter name | Parameter description | Request type | Required | Data type | Schema |
| --- | --- | --- | --- | --- | --- |
| path | result directory | query | true | string | |
- Sample Response
```json
{
"method": "/api/filesystem/getDirFileTrees",
"status": 0,
"message": "OK",
"data": {
"dirFileTrees": {
"name": "1946923",
"path": "hdfs:///tmp/hadoop/linkis/2022-07-06/211446/IDE/1946923",
"properties": null,
"children": [
{
"name": "_0.dolphin",
"path": "hdfs:///tmp/hadoop/linkis/2022-07-06/211446/IDE/1946923/_0.dolphin",//result set 1
"properties": {
"size": "7900",
"modifytime": "1657113288360"
},
"children": null,
"isLeaf": true,
"parentPath": "hdfs:///tmp/hadoop/linkis/2022-07-06/211446/IDE/1946923"
},
{
"name": "_1.dolphin",
"path": "hdfs:///tmp/hadoop/linkis/2022-07-06/211446/IDE/1946923/_1.dolphin",//result set 2
"properties": {
"size": "7900",
"modifytime": "1657113288614"
},
"children": null,
"isLeaf": true,
"parentPath": "hdfs:///tmp/hadoop/linkis/2022-07-06/211446/IDE/1946923"
}
],
"isLeaf": false,
"parentPath": null
}
}
}
```
8. Get Result Content
Interface
/api/rest_j/v1/filesystem/openFile
Submission method
GET
Request Parameters:
| Parameter name | Parameter description | Request type | Required | Data type | Schema |
| --- | --- | --- | --- | --- | --- |
| path | result path | query | true | string | |
| charset | charset | query | false | string | |
| page | page number | query | false | ref | |
| pageSize | page size | query | false | ref | |
- Sample Response
```json
{
"method": "/api/filesystem/openFile",
"status": 0,
"message": "OK",
"data": {
"metadata": [
{
"columnName": "count(1)",
"comment": "NULL",
"dataType": "long"
}
],
"totalPage": 0,
"totalLine": 1,
"page": 1,
"type": "2",
"fileContent": [
[
"28"
]
]
}
}
```
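Putting sections 7 and 8 together: list the .dolphin result-set files under the task's resultLocation, then open each one. A sketch reusing the shared `session`; the pageSize value is illustrative:

```python
def read_result_sets(result_location):
    """List result-set files in the result directory, then fetch each one's rows."""
    tree = session.get(
        f"{BASE}/api/rest_j/v1/filesystem/getDirFileTrees",
        params={"path": result_location},
    ).json()["data"]["dirFileTrees"]

    results = []
    for child in tree["children"] or []:
        content = session.get(
            f"{BASE}/api/rest_j/v1/filesystem/openFile",
            params={"path": child["path"], "page": 1, "pageSize": 5000},
        ).json()["data"]
        columns = [m["columnName"] for m in content["metadata"]]
        results.append((columns, content["fileContent"]))
    return results
```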
9. Get Result by Stream
Get the result as a CSV or Excel file
Interface
/api/rest_j/v1/filesystem/resultsetToExcel
Submission method
GET
Request Parameters:
| Parameter name | Parameter description | Request type | Required | Data type | Schema |
| --- | --- | --- | --- | --- | --- |
| autoFormat | whether to auto-format values | query | false | boolean | |
| charset | charset | query | false | string | |
| csvSeperator | CSV separator | query | false | string | |
| limit | row limit | query | false | ref | |
| nullValue | placeholder for null values | query | false | string | |
| outputFileName | output file name | query | false | string | |
| outputFileType | output file type, csv or excel | query | false | string | |
| path | result path | query | false | string | |
| quoteRetouchEnable | whether to enable quote retouching | query | false | boolean | |
| sheetName | sheet name | query | false | string | |
- Response
binary stream
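A download sketch that streams the body straight to a local file; the parameter values are illustrative:

```python
def download_result_as_csv(result_path, out_file="result.csv"):
    """Stream a result set to a local CSV file."""
    with session.get(
        f"{BASE}/api/rest_j/v1/filesystem/resultsetToExcel",
        params={
            "path": result_path,
            "outputFileType": "csv",
            "outputFileName": "result",
            "charset": "utf-8",
        },
        stream=True,
    ) as r:
        r.raise_for_status()
        with open(out_file, "wb") as f:
            for chunk in r.iter_content(chunk_size=8192):
                f.write(chunk)
```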
10. Compatible with 0.x task submission interface
Interface
/api/rest_j/v1/entrance/execute
Submission method
POST
Request Parameters
```json
{
"executeApplicationName": "hive", //Engine type
"requestApplicationName": "dss", //Client service type
"executionCode": "show tables",
"params": {
"variable": {// task variable
"testvar": "hello"
},
"configuration": {
"runtime": {// task runtime params
"jdbc.url": "XX"
},
"startup": { // ec start up params
"spark.executor.cores": "4"
}
}
},
"source": { //task source information
"scriptPath": "file:///tmp/hadoop/test.sql"
},
"labels": {
"engineType": "spark-2.4.3",
"userCreator": "hadoop-IDE"
},
"runType": "hql", //The type of script to run
"source": {"scriptPath":"file:///tmp/hadoop/1.hql"}
}
- Sample Response
```json
{
"method": "/api/rest_j/v1/entrance/execute",
"status": 0,
"message": "Request executed successfully",
"data": {
"execID": "030418IDEhivebdpdwc010004:10087IDE_hadoop_21",
"taskID": "123"
}
}
```