
Commit d169355

update 1.5.0 doc (#774)
1 parent 0b4ee17 commit d169355


66 files changed: +262 -70 lines changed

docs/engine-usage/elasticsearch.md

+1 -1

@@ -59,7 +59,7 @@ ${linkis_code_dir}/linkis-engineconn-plugins/elasticsearch/target/out/
 
 Upload the engine plug-in package in 2.1 to the engine directory of the server
 ```bash
-${LINKIS_HOME}/lib/linkis-engineplugins
+${LINKIS_HOME}/lib/linkis-engineconn-plugins
 ```
 The directory structure after uploading is as follows
 ```
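
A minimal sketch of the deployment step these docs describe, assuming a local build and a single-node install; the exact layout under target/out/ and the service name may differ by Linkis version:

```bash
# Copy the rebuilt engine plugin (elasticsearch shown) into the renamed engine directory.
cp -r ${linkis_code_dir}/linkis-engineconn-plugins/elasticsearch/target/out/* \
      ${LINKIS_HOME}/lib/linkis-engineconn-plugins/
# Restart the engine plugin service so the refreshed plugin is loaded
# (service name as quoted in the FAQ change further down in this commit).
sh ${LINKIS_HOME}/sbin/linkis-daemon.sh restart cg-engineplugin
```

The same pattern applies to the other engine docs touched below; only the engine name in the source path changes.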

docs/engine-usage/flink.md

+1 -1

@@ -55,7 +55,7 @@ ${linkis_code_dir}/linkis-engineconn-plugins/flink/target/out/
 
 Upload the engine plug-in package in 2.1 to the engine directory of the server
 ```bash
-${LINKIS_HOME}/lib/linkis-engineplugins
+${LINKIS_HOME}/lib/linkis-engineconn-plugins
 ```
 The directory structure after uploading is as follows
 ```

docs/engine-usage/impala.md

+1 -1

@@ -54,7 +54,7 @@ ${linkis_code_dir}/linkis-engineconn-plugins/impala/target/out/
 
 Upload the engine package in 2.1 to the engine directory of the server
 ```bash
-${LINKIS_HOME}/lib/linkis-engineplugins
+${LINKIS_HOME}/lib/linkis-engineconn-plugins
 ```
 The directory structure after uploading is as follows
 ```

docs/engine-usage/jdbc.md

+1 -1

@@ -56,7 +56,7 @@ ${linkis_code_dir}/linkis-engineconn-plugins/jdbc/target/out/
 
 Upload the engine plug-in package in 2.1 to the engine directory of the server
 ```bash
-${LINKIS_HOME}/lib/linkis-engineplugins
+${LINKIS_HOME}/lib/linkis-engineconn-plugins
 ```
 The directory structure after uploading is as follows
 ```

docs/engine-usage/openlookeng.md

+1 -1

@@ -63,7 +63,7 @@ ${linkis_code_dir}/linkis-engineconn-plugins/openlookeng/target/out/
 
 Upload the engine plug-in package in 2.1 to the engine directory of the server
 ```bash
-${LINKIS_HOME}/lib/linkis-engineplugins
+${LINKIS_HOME}/lib/linkis-engineconn-plugins
 ```
 The directory structure after uploading is as follows
 ```

docs/engine-usage/pipeline.md

+1 -1

@@ -27,7 +27,7 @@ ${linkis_code_dir}/linkis-engineconn-plugins/pipeline/target/out/
 
 Upload the engine plug-in package in 1.1 to the engine directory of the server
 ```bash
-${LINKIS_HOME}/lib/linkis-engineplugins
+${LINKIS_HOME}/lib/linkis-engineconn-plugins
 ```
 The directory structure after uploading is as follows
 ```

docs/engine-usage/presto.md

+1 -1

@@ -62,7 +62,7 @@ ${linkis_code_dir}/linkis-engineconn-plugins/presto/target/out/
 
 Upload the engine package in 2.1 to the engine directory of the server
 ```bash
-${LINKIS_HOME}/lib/linkis-engineplugins
+${LINKIS_HOME}/lib/linkis-engineconn-plugins
 ```
 The directory structure after uploading is as follows
 ```

docs/engine-usage/seatunnel.md

+1 -1

@@ -61,7 +61,7 @@ ${linkis_code_dir}/linkis-engineconn-plugins/seatunnel/target/out/
 
 Upload the engine package in 2.1 to the engine directory of the server
 ```bash
-${LINKIS_HOME}/lib/linkis-engineplugins
+${LINKIS_HOME}/lib/linkis-engineconn-plugins
 ```
 The directory structure after uploading is as follows
 ```

docs/engine-usage/sqoop.md

+1 -1

@@ -67,7 +67,7 @@ ${linkis_code_dir}/linkis-engineconn-plugins/sqoop/target/out/
 
 Upload the engine package in 2.1 to the engine directory of the server
 ```bash
-${LINKIS_HOME}/lib/linkis-engineplugins
+${LINKIS_HOME}/lib/linkis-engineconn-plugins
 ```
 The directory structure after uploading is as follows
 ```

docs/engine-usage/trino.md

+1 -1

@@ -62,7 +62,7 @@ ${linkis_code_dir}/linkis-engineconn-plugins/trino/target/out/
 
 Upload the engine package in 2.1 to the engine directory of the server
 ```bash
-${LINKIS_HOME}/lib/linkis-engineplugins
+${LINKIS_HOME}/lib/linkis-engineconn-plugins
 ```
 The directory structure after uploading is as follows
 ```

download/release-notes-1.5.0.md

+97

@@ -0,0 +1,97 @@
+---
+title: Release Notes 1.5.0
+sidebar_position: 0.13
+---
+
+Apache Linkis 1.5.0 includes everything in [Project Linkis-1.5.0](https://github.com/apache/linkis/projects/27)
+
+Linkis 1.5.0 mainly adds the following features: the service registry can be switched to Nacos; a new HBase engine supporting hbase-shell syntax; support for the Nebula graph-database engine; a new REPL interpreter engine that runs Java and Scala code snippets; Spark engine support for Yarn Cluster mode; and support for submitting Jar tasks on Kubernetes with the Spark and Flink engines.
+
+The main features are as follows:
+
+- The service registry can be switched to Nacos
+- New HBase engine
+- New Nebula engine
+- New REPL interpreter engine
+- Support for Spark on Kubernetes Jar/PySpark tasks
+- Support for Flink on Kubernetes Jar tasks
+- Spark engine support for Yarn Cluster mode
+- The Linkis JDBC driver supports selecting among multiple engines and versions
+- Basic data management adds configuration item management
+- The management console adds operations tools and user configuration management
+- Entrance supports task takeover (experimental)
+
+
+Abbreviations:
+- COMMON: Linkis Common
+- ENTRANCE: Linkis Entrance
+- EC: Engineconn
+- ECM: EngineConnManager
+- ECP: EngineConnPlugin
+- DMS: Data Source Manager Service
+- MDS: MetaData Manager Service
+- LM: Linkis Manager
+- PS: Linkis Public Service
+- PE: Linkis Public Enhancement
+- RPC: Linkis Common RPC
+- CG: Linkis Computation Governance
+- DEPLOY: Linkis Deployment
+- WEB: Linkis Web
+- GATEWAY: Linkis Gateway
+- EP: Engine Plugin
+- ORCHESTRATOR: Linkis Orchestrator
+
+
+## New features
+- \[EC][LINKIS-5008](https://github.com/apache/linkis/pull/5008) The service registry supports Nacos
+- \[GATEWAY][LINKIS-4992](https://github.com/apache/linkis/pull/4992) Gateway supports access-control configuration
+- \[EC-REPL][LINKIS-4940](https://github.com/apache/linkis/pull/4940) Add a REPL interpreter engine that executes Scala and Java code
+- \[EC-NEBULA][LINKIS-4903](https://github.com/apache/linkis/pull/4903) Add a Nebula engine
+- \[EC-HBASE][LINKIS-4891](https://github.com/apache/linkis/pull/4891) Add an HBase engine
+- \[EC-SPARK][LINKIS-4850](https://github.com/apache/linkis/pull/4850) Spark supports running in Yarn Cluster mode
+- \[EC-SPARK][LINKIS-4867](https://github.com/apache/linkis/pull/4867) Spark supports submitting Jar tasks on Kubernetes
+- \[EC-SPARK][LINKIS-4906](https://github.com/apache/linkis/pull/4906) Spark supports submitting PySpark jobs on Kubernetes
+- \[JDBC-DRIVER][LINKIS-4930](https://github.com/apache/linkis/pull/4930) The JDBC driver supports multiple engine versions
+- \[EC-FLINK][LINKIS-4753](https://github.com/apache/linkis/pull/4753) Upgrade Flink to 1.16.2 and remain compatible with multiple versions
+- \[ENTRANCE][LINKIS-4282](https://github.com/apache/linkis/pull/4282) Experimental: the Entrance service supports HA
+- \[MONITOR][LINKIS-4905](https://github.com/apache/linkis/pull/4905) Experimental: add a Linkis Monitor service
+- \[WEB][LINKIS-4940](https://github.com/apache/linkis/pull/4940) Experimental: upgrade the management console frontend to a new architecture
+
+
+## Enhancements
+- \[ECM][LINKIS-4990](https://github.com/apache/linkis/pull/4990) Support downloading EC log files from the management console
+- \[EC][LINKIS-4982](https://github.com/apache/linkis/pull/4982) Enrich EC metrics with lock and idle-time metrics
+- \[WEB][LINKIS-4954](https://github.com/apache/linkis/pull/4954) Add a user configuration management page to the management console
+- \[EC-SPARK][LINKIS-4961](https://github.com/apache/linkis/pull/4961) Add more default class imports to PySpark
+- \[EC-PYTHON][LINKIS-4835](https://github.com/apache/linkis/pull/4835) Improve error message printing in the Python engine
+- \[EC-SPARK][LINKIS-4896](https://github.com/apache/linkis/pull/4896) Spark Once tasks support the EngineConnRuntimeMode label
+- \[LINKISManager][LINKIS-4914](https://github.com/apache/linkis/pull/4914) Optimize LinkisManager resource ordering to select from largest to smallest
+- \[WEB][LINKIS-4935](https://github.com/apache/linkis/pull/4935) The management console supports the spark.conf parameter, allowing multiple native Spark parameters to be configured
+- \[EC][LINKIS-4714](https://github.com/apache/linkis/pull/4714) EC supports specifying a debug port range
+- \[EC-FLINK][LINKIS-5023](https://github.com/apache/linkis/pull/5023) The Flink engine supports reading user-defined log4j configuration
+- \[PES][LINKIS-4838](https://github.com/apache/linkis/pull/4838) File read and write interfaces support more parameters and operations
+- \[LINKISManager][LINKIS-4850](https://github.com/apache/linkis/pull/4852) LinkisManager supports managing Kubernetes resources
+- \[PE][LINKIS-4847](https://github.com/apache/linkis/pull/4847) Reduce the module count by merging the common data source modules
+- \[PE][LINKIS-4853](https://github.com/apache/linkis/pull/4853) Reduce the module count by merging the common client modules into the pes-client module
+- \[PE][LINKIS-4854](https://github.com/apache/linkis/pull/4854) Reduce the module count by merging several public service modules
+- \[PE][LINKIS-4934](https://github.com/apache/linkis/pull/4934) The data source service supports both merged and standalone deployment
+- \[EC-FLINK][LINKIS-5025](https://github.com/apache/linkis/pull/5025) Flink supports loading a default configuration
+- \[EC-JDBC][LINKIS-5007](https://github.com/apache/linkis/pull/5007) JDBC supports chaining tasks across multiple jobs
+
+
+## Bug fixes
+- \[EC-FLINK][LINKIS-5041](https://github.com/apache/linkis/pull/5041) Fix interactive Flink SQL printing error logs when fetching status
+- \[MDS][LINKIS-4998](https://github.com/apache/linkis/issues/4998) Make the ES data source compatible with both 6.X and 7.X
+- \[EC-SPARK][LINKIS-4996](https://github.com/apache/linkis/pull/4996) Spark Scala tasks support printing error messages to the task log
+- \[ENTRANCE][LINKIS-4967](https://github.com/apache/linkis/pull/4967) Fix SQL parsing errors caused by semicolons inside comments
+- \[EC][LINKIS-4920](https://github.com/apache/linkis/pull/4920) Fix null values in result sets being returned as the string NULL
+- \[CLI][LINKIS-4919](https://github.com/apache/linkis/pull/4919) Fix a concurrency NPE in the client
+- \[CG][LINKIS-4915](https://github.com/apache/linkis/pull/4915) Fix LinkisManager's ECM selection logic incorrectly choosing heavily loaded instances
+- \[LM][LINKIS-4860](https://github.com/apache/linkis/pull/4860) Fix garbled Chinese characters in linkis-httpclient POST requests
+- \[LM][LINKIS-4771](https://github.com/apache/linkis/pull/4771) Linkis-Cli once job submission should use Once mode
+- \[LM][LINKIS-4731](https://github.com/apache/linkis/pull/4731) The kill EC script should skip the ECM process
+
+## Acknowledgements
+The release of Apache Linkis 1.5.0 would not have been possible without the contributors of the Linkis community. Thanks to all community contributors, including but not limited to the following Contributors (in no particular order): Casion, ChengJie1053, CoderSerio, GuoPhilipse, LiuGuoHua, Yonghao, ZHANG, Zhen, aiceflower, chengrui1, dependabot, guoshupei, [email protected], peacewong, peter.peng, sjgllgh, v-kkhuang, weixiao, yangwenzea, yijut2, zhangwejun, zhaowenkai111, zlucelia, 赵文恺, jackxu2011.
+
+

faq/main.md

+4 -4

@@ -87,9 +87,9 @@ A: This should be caused by repeated installations, resulting in the result set
 ## Q3: The json4s package conflict caused by inconsistent Spark versions, the error is as follows: Error message: caused by: java.lang.NoSuchMethodError: org.json4s.jackson.jsonMethod$
 
 solution:
-This is because of Spark jars' json4s and lib/linkis-engineplugins/spark/dist/version/lib
+This is because of Spark jars' json4s and lib/linkis-engineconn-plugins/spark/dist/version/lib
 The json4s version in the package is inconsistent. When the official release is released, the supported version of Spark will be indicated later. If it is inconsistent, this problem will exist.
-The solution is to replace the json4s package in Spark jars with lib/linkis-engineplugins/spark/dist/version/lib
+The solution is to replace the json4s package in Spark jars with lib/linkis-engineconn-plugins/spark/dist/version/lib
 The json4s version inside the package. In addition, there may be conflicts in the netty package, which can be handled according to the method of Json4s. Then restart the ecp service: sh sbin/linkis-damon.sh restart cg-engineplugin
 
 ## Q4: When Linkis1.X submits spark sql tasks in version CDH5.16.1, how to troubleshoot 404 problems
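
A hedged sketch of the json4s replacement this FAQ entry describes, assuming the plugin's bundled json4s is replaced with the one shipped in your Spark distribution; SPARK_VERSION and the dist directory name are examples only:

```bash
# Paths are illustrative; match SPARK_HOME, LINKIS_HOME and the dist version to your install.
SPARK_VERSION=3.2.1
PLUGIN_LIB=${LINKIS_HOME}/lib/linkis-engineconn-plugins/spark/dist/${SPARK_VERSION}/lib
# Drop the json4s jars bundled with the Spark engine plugin ...
rm -f ${PLUGIN_LIB}/json4s-*.jar
# ... and copy in the json4s jars from the Spark distribution itself.
cp ${SPARK_HOME}/jars/json4s-*.jar ${PLUGIN_LIB}/
# Restart the ecp service, as the FAQ entry notes.
sh ${LINKIS_HOME}/sbin/linkis-daemon.sh restart cg-engineplugin
```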
@@ -210,7 +210,7 @@ In the front end - management console - settings - general settings - Yarn queue
 
 solution:
 This is because the hive transaction is enabled, you can modify the hive-site.xml on the linkis machine to turn off the transaction configuration, refer to the hive transaction: https://www.jianshu.com/p/aa0f0fdd234c
-Or put the relevant package into the engine plugin directory lib/linkis-engineplugins/hive/dist/version/lib
+Or put the relevant package into the engine plugin directory lib/linkis-engineconn-plugins/hive/dist/version/lib
 
 
 
@@ -403,7 +403,7 @@ This is because the instance is forcibly shut down, but the persistence in the d
 ![](/faq/q43_1.png)
 
 ① The parameter configuration of the management console can correspond to the engine parameters, and the timeout time can be modified. After saving, kill the existing engine.
-②If the timeout configuration is not displayed, you need to manually modify the linkis-engineplugins directory, the corresponding engine plugin directory such as spark/dist/v2.4.3/conf/linkis-engineconn.properties, the default configuration is wds.linkis.engineconn.max.free.time =1h, means 1h overtime, and can have units m and h. 0 means no timeout, no automatic kill. After the change, you need to restart ecp, and kill the existing engine, and run a new task to start the engine to take effect.
+②If the timeout configuration is not displayed, you need to manually modify the linkis-engineconn-plugins directory, the corresponding engine plugin directory such as spark/dist/v2.4.3/conf/linkis-engineconn.properties, the default configuration is wds.linkis.engineconn.max.free.time =1h, means 1h overtime, and can have units m and h. 0 means no timeout, no automatic kill. After the change, you need to restart ecp, and kill the existing engine, and run a new task to start the engine to take effect.
 
 ## Q22: When creating a new workflow, it prompts "504 Gateway Time-out"
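
For the idle-timeout change in the last hunk, a minimal sketch of the manual edit; the spark/v2.4.3 path is the FAQ's own example, so substitute your engine and version:

```bash
# Edit the engine plugin's properties file under the renamed directory.
CONF=${LINKIS_HOME}/lib/linkis-engineconn-plugins/spark/dist/v2.4.3/conf/linkis-engineconn.properties
# 2h kills idle engines after two hours; units m and h are accepted, 0 disables the automatic kill.
sed -i 's/^wds.linkis.engineconn.max.free.time *=.*/wds.linkis.engineconn.max.free.time=2h/' "$CONF"
# Restart ecp and kill the existing engine so a newly started engine picks up the change.
sh ${LINKIS_HOME}/sbin/linkis-daemon.sh restart cg-engineplugin
```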
