
dtstack / chengying


A software tool that supports standardized schema definitions and automated deployment of product packages. It aims to handle deployment, upgrade, uninstallation, configuration and other operations for every service in a product package, reducing manual O&M effort.

License: Apache License 2.0

Makefile 0.06% Go 53.05% Dockerfile 0.08% Shell 4.09% PowerShell 0.31% JavaScript 6.55% TypeScript 31.59% CSS 0.10% HTML 0.18% SCSS 1.95% XSLT 0.02% Batchfile 0.11% PLpgSQL 1.91%

chengying's People

Contributors

lijiangbo, wangqi811, zaoei


chengying's Issues

Hello! Why does my agent keep reporting errors? The Add Host page shows nothing; the added host exists in the database, but the sid column is empty.

      • chengying-server logs:
        MATRIX-DEBUG:2022/11/25 21:04:55 cluster.go:2192: [Cluster->GetHostClusterList] GetHostClusterList from EasyMatrix API
        MATRIX-DEBUG:2022/11/25 21:04:55 cluster.go:2771: HostGroups: /api/v2/cluster/hostgroups?type=hosts&id=23
        MATRIX-DEBUG:2022/11/25 21:04:58 cluster.go:2771: HostGroups: /api/v2/cluster/hostgroups?type=hosts&id=23
        MATRIX-DEBUG:2022/11/25 21:04:58 cluster.go:2192: [Cluster->GetHostClusterList] GetHostClusterList from EasyMatrix API
        MATRIX-DEBUG:2022/11/25 21:04:59 agent.go:224: [CheckPwdConnect] check ssh connect by user password from EasyMatrix API
        MATRIX-DEBUG:2022/11/25 21:04:59 agent.go:129: [skip exist cluster]%!(EXTRA *errors.errorString=sql: no rows in result set)
        MATRIX-DEBUG:2022/11/25 21:05:01 cluster.go:2192: [Cluster->GetHostClusterList] GetHostClusterList from EasyMatrix API
        MATRIX-DEBUG:2022/11/25 21:05:01 cluster.go:2771: HostGroups: /api/v2/cluster/hostgroups?type=hosts&id=23
        MATRIX-DEBUG:2022/11/25 21:05:03 agent.go:284: [AgentInstallByPwd] install agent by user password from EasyMatrix API
        MATRIX-DEBUG:2022/11/25 21:05:03 agent.go:299: * * * AgentInstallByPwd * * * ctx -> &{0xc001815380 0xc001864b00 {[]} [{userId 2 false} {username [email protected] false} {expiration {0 63805237107 0x3d7e0c0} false}] 0xc0018d84d0 [0x9b28a0] 0}
        MATRIX-DEBUG:2022/11/25 21:05:03 agent.go:300: * * * AgentInstallByPwd * * * params -> &{{172.21.0.15 22 root Datasphere#12345! } 23 hosts }
        MATRIX-DEBUG:2022/11/25 21:05:03 agent.go:327: * * * execAgentInstallByPwd * * * info -> &{0 0 0 0 {0 0 } {0 0 } { false}}
        MATRIX-DEBUG:2022/11/25 21:05:03 agent.go:373: * * * execAgentInstallByPwd * * * 373
        MATRIX-DEBUG:2022/11/25 21:05:03 agent.go:118: GetAgentInstallCmd: curl -s 'http://127.0.0.1:8889/api/v1/deploy/sidecar/install/shell?TargetPath=/opt/dtstack/easymanager/easyagent&CallBack=aHR0cDovLzgyLjE1Ny4yNi4xNTI6ODA5OS9hcGkvdjIvYWdlbnQvaW5zdGFsbC9jYWxsYmFjaz9haWQ9MTk=&Type=hosts&ClusterId=23&Roles=' | sh
        MATRIX-DEBUG:2022/11/25 21:05:03 agent.go:399: * * * execAgentInstallByPwd * * * 399
        MATRIX-DEBUG:2022/11/25 21:05:04 cluster.go:2771: HostGroups: /api/v2/cluster/hostgroups?type=hosts&id=23
        MATRIX-DEBUG:2022/11/25 21:05:04 cluster.go:2192: [Cluster->GetHostClusterList] GetHostClusterList from EasyMatrix API
        MATRIX-DEBUG:2022/11/25 21:05:07 cluster.go:2192: [Cluster->GetHostClusterList] GetHostClusterList from EasyMatrix API
        MATRIX-DEBUG:2022/11/25 21:05:07 cluster.go:2771: HostGroups: /api/v2/cluster/hostgroups?type=hosts&id=23
        MATRIX-DEBUG:2022/11/25 21:05:10 agent.go:435: * * * AgentInstallByPwd * * * resBody -> map[result:[+] Installing agent (easyagent sidecar)...
        Stopping easyagent-sidecar
        27370
        Killing easyagent-sidecar (pid 27370) with SIGTERM
        Waiting easyagent-sidecar (pid 27370) to die...
        Waiting easyagent-sidecar (pid 27370) to die...
        easyagent-sidecar stopped stopped
        [+] config EasyManage agent (easyagent sidecar)...
        [+] easyagent-sidecar is already installed...
        [+] setting EasyManage agent (easyagent sidecar)...
        Starting easyagent-sidecar
        started
        start easyagent success! ...
        ]

==> easyagent-server-error.log <==
EA-SERVER-ERROR:2022/11/25 21:05:03 local-set.go:47: [InitInstallSidecarShWithLocalFile] easyagent_install.sh, err: stat /tmp/go-build116945771/b001/exe/easyagent_install.sh: no such file or directory
EA-SERVER-ERROR:2022/11/25 21:05:03 local-set.go:47: [InitInstallSidecarShWithLocalFile] easyagent_install_4win.ps1, err: stat /tmp/go-build116945771/b001/exe/easyagent_install_4win.ps1: no such file or directory

==> easyagent-server.log <==
EA-SERVER-DEBUG:2022/11/25 21:05:03 local-set.go:31: * * * [SetAssetWithLocalFile] * * * os.Args[0] -> /tmp/go-build116945771/b001/exe/main
EA-SERVER-DEBUG:2022/11/25 21:05:03 local-set.go:31: * * * [SetAssetWithLocalFile] * * * os.Args[0] -> /tmp/go-build116945771/b001/exe/main

==> easyagent-server-error.log <==

==> agent-install.log <==
agent-install2022/11/25 21:05:03 deploy.go:96: 2022-11-25 21:05:03.605550 [INSTALL] InitSidecarInstallSh success: easyagent_install.sh

==> easyagent-server.log <==
EA-SERVER-DEBUG:2022/11/25 21:05:03 deploy.go:101: * * * GetSidecarInstallShell * * * ctx -> &{0xc0002ec510 0xc0004b0400 {[]} [] 0xc00046c000 [0x924ae0] 0}
EA-SERVER-DEBUG:2022/11/25 21:05:10 client.go:335: * * * RunSync * * * out -> [+] Installing agent (easyagent sidecar)...
Stopping easyagent-sidecar
27370
Killing easyagent-sidecar (pid 27370) with SIGTERM
Waiting easyagent-sidecar (pid 27370) to die...
Waiting easyagent-sidecar (pid 27370) to die...
easyagent-sidecar stopped stopped
[+] config EasyManage agent (easyagent sidecar)...
[+] easyagent-sidecar is already installed...
[+] setting EasyManage agent (easyagent sidecar)...
Starting easyagent-sidecar
started
start easyagent success! ...
EA-SERVER-DEBUG:2022/11/25 21:05:10 client.go:336: * * * RunSync * * * err ->
EA-SERVER-DEBUG:2022/11/25 21:05:10 client.go:337: * * * RunSync * * * cmd -> curl -s 'http://127.0.0.1:8889/api/v1/deploy/sidecar/install/shell?TargetPath=/opt/dtstack/easymanager/easyagent&CallBack=aHR0cDovLzgyLjE1Ny4yNi4xNTI6ODA5OS9hcGkvdjIvYWdlbnQvaW5zdGFsbC9jYWxsYmFjaz9haWQ9MTk=&Type=hosts&ClusterId=23&Roles=' | sh
EA-SERVER-DEBUG:2022/11/25 21:05:10 client.go:342: RunSync cmd output::[+] Installing agent (easyagent sidecar)...
Stopping easyagent-sidecar
27370
Killing easyagent-sidecar (pid 27370) with SIGTERM
Waiting easyagent-sidecar (pid 27370) to die...
Waiting easyagent-sidecar (pid 27370) to die...
easyagent-sidecar stopped stopped
[+] config EasyManage agent (easyagent sidecar)...
[+] easyagent-sidecar is already installed...
[+] setting EasyManage agent (easyagent sidecar)...
Starting easyagent-sidecar
started
start easyagent success! ...


      • agent-sidecar logs:
        AGENT-ERROR:2022/11/25 21:05:01 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
        AGENT-ERROR:2022/11/25 21:05:03 main.go:169: quit according to signal 'terminated'
        AGENT-ERROR:2022/11/25 21:05:03 event.go:66: ReportEvent error: not register sidecar

==> agent.log <==
AGENT-DEBUG:2022/11/25 21:05:03 tc_linux.go:49: delete old tc configuration
AGENT-DEBUG:2022/11/25 21:05:10 tc_linux.go:79: initialize tc configuration
AGENT-DEBUG:2022/11/25 21:05:10 tc_linux.go:49: delete old tc configuration
AGENT-DEBUG:2022/11/25 21:05:10 monitor.go:214: start collectSystemMetrics...
AGENT-DEBUG:2022/11/25 21:05:10 monitor.go:64: start collectMetrics PID: 29420...

==> agent-error.log <==
AGENT-ERROR:2022/11/25 21:05:10 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:13 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:16 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:19 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:22 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:25 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:28 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:31 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:34 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:37 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:40 event.go:66: ReportEvent error: not register sidecar
AGENT-ERROR:2022/11/25 21:05:40 event.go:66: ReportEvent error: not register sidecar
AGENT-ERROR:2022/11/25 21:05:40 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:43 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:46 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:49 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:52 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:55 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:05:58 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:06:01 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:06:04 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:06:07 rigister.go:128: RegisterSidecar error: rpc error: code = Unknown desc = sql: no rows in result set
AGENT-ERROR:2022/11/25 21:06:10 event.go:66: ReportEvent error: not register sidecar
AGENT-ERROR:2022/11/25 21:06:10 event.go:66: ReportEvent error: not register sidecar
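
The repeated "sql: no rows in result set" in agent-error.log is Go's sql.ErrNoRows: a server-side lookup (apparently the one that should pair the new host row with a sidecar id) finds no matching row, so RegisterSidecar keeps failing and the sid column never gets filled. A minimal sketch of how such a lookup looks in Go, purely for illustration (the connection string, table and column names below are assumptions, not chengying's actual schema):

package main

import (
	"database/sql"
	"errors"
	"fmt"
	"log"

	_ "github.com/go-sql-driver/mysql"
)

func main() {
	db, err := sql.Open("mysql", "user:pass@tcp(127.0.0.1:3306)/dtagent")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	var sid string
	// Hypothetical query: find the sidecar id registered for a given host IP.
	err = db.QueryRow("SELECT sid FROM deploy_host WHERE ip = ?", "172.21.0.15").Scan(&sid)
	switch {
	case errors.Is(err, sql.ErrNoRows):
		// This is exactly the condition that surfaces as
		// "sql: no rows in result set": no row matched the lookup key.
		fmt.Println("no sidecar registered for this host yet")
	case err != nil:
		log.Fatal(err)
	default:
		fmt.Println("sid:", sid)
	}
}

Comparing the query run at agent.go:129 ([skip exist cluster]) and rigister.go:128 against the actual table contents should show which key is missing.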


The QR code has expired

Describe the bug
The DingTalk QR code has expired.


Bug in IP address handling when adding hosts

Version: 1.1.1
OS: CentOS 7.9
[screenshot]
When adding hosts, the host list hints that a range such as 172.16.0.10-15 can be used.
In practice this form cannot be added: during the connectivity test the range is split on the hyphen and each address is tested individually,
but on confirmation 172.16.0.10-15 is treated as a single host for agent installation, so the installation fails.
Hosts can be added without problems when the hyphenated range form is not used.
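
For reference, one way to make the confirmation step consistent with the connectivity test is to expand the hyphenated range into individual addresses before installing the agent. A minimal sketch (illustrative only, not chengying's actual code):

package main

import (
	"fmt"
	"strconv"
	"strings"
)

// expandIPRange turns "172.16.0.10-15" into 172.16.0.10 .. 172.16.0.15.
// A plain address without a hyphen is returned unchanged.
func expandIPRange(s string) ([]string, error) {
	dash := strings.LastIndex(s, "-")
	if dash == -1 {
		return []string{s}, nil
	}
	base := s[:dash] // e.g. "172.16.0.10"
	end, err := strconv.Atoi(s[dash+1:])
	if err != nil {
		return nil, fmt.Errorf("invalid range end in %q: %w", s, err)
	}
	dot := strings.LastIndex(base, ".")
	start, err := strconv.Atoi(base[dot+1:])
	if err != nil || start > end || end > 255 {
		return nil, fmt.Errorf("invalid range %q", s)
	}
	prefix := base[:dot+1] // e.g. "172.16.0."
	var ips []string
	for i := start; i <= end; i++ {
		ips = append(ips, prefix+strconv.Itoa(i))
	}
	return ips, nil
}

func main() {
	ips, _ := expandIPRange("172.16.0.10-15")
	fmt.Println(ips) // [172.16.0.10 172.16.0.11 ... 172.16.0.15]
}

Whichever expansion the connectivity test uses, the confirm/install step should reuse the same expanded list instead of the raw "172.16.0.10-15" string.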

config_paths mounted files show no content in the UI

I set config_paths file paths in the schema, but when I select such a file in the UI its content is not displayed; it just shows "None".
My understanding is that this feature provides a place in the UI to edit the mounted configuration files, but after I select a file it only reports "None", and I don't know where the problem is.
[screenshot]

Below is my schema configuration file:

parent_product_name: Base
product_name: ActiveMQ
product_name_display: ActiveMQ
product_version: 5.18.0-arm-1
service:
  ActiveMQ:
    service_display: ActiveMQ
    version: 5.18.0
    instance:
      cmd: ./bin/start.sh ${api_port} ${api_user} ${api_password} ${web_port} ${web_user} ${web_password} ${disallow_web}
      post_deploy: ./bin/post_deploy.sh
      config_paths:
        - conf/activemq.xml
        - conf/users.properties
      logs:
        - data/activemq.log
      update_recreate: true
      prometheus_port: 9104
    config:
      api_port: 61616
      api_user: admin
      api_password: 'admin@123'
      web_port: 8161
      web_user: admin
      web_password: 'admin@123'
      disallow_web: "false"

Unable to add hosts

When adding one or more hosts to a host cluster, the connectivity test reports success, but after clicking Confirm the host list is empty, and the hosts are still not visible after saving the cluster.

So far I have tried reinstalling and creating a new cluster to re-add the hosts; both attempts failed.

How should I go about checking the logs?

[root@jumpserver easymanager]# docker images | grep dtopensource
dtopensource/manage-sql                           1.0                 38ab5810c6f9        3 weeks ago         451MB
dtopensource/ntpd                                 1.0                 e0b54f9a9957        3 weeks ago         228MB
dtopensource/prometheus                           1.0                 0f11c6aa3a32        3 weeks ago         123MB
dtopensource/alertmanager                         1.0                 4619e38a08c2        3 weeks ago         31.9MB
dtopensource/pushgateway                          1.0                 87963e37ea4a        3 weeks ago         16.4MB
dtopensource/matrix                               1.0                 b24f8f3599d4        3 weeks ago         109MB
dtopensource/easy-agent-server                    1.0                 30e5b524c4f9        3 weeks ago         50.5MB
dtopensource/grafana                              1.0                 d4982fb4dc57        3 weeks ago         277MB
dtopensource/dt-alert                             1.0                 cd92281da460        3 weeks ago         163MB
dtopensource/manage-front                         1.0                 d2b22715df2c        3 weeks ago         42.8MB

Hive deployment fails

[screenshot]

During deployment, the Hive instance needs the following change: change mysql_path: "/opt/dtstack/DTBase/mysql/bin" to mysql_path: "/opt/dtstack/Mysql/mysql/bin".

SQL injection issue

A time-based (sleep) injection can be triggered successfully:
'and(select*from(select+sleep(15))a//union//select+1)='
[screenshot]
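
Time-based payloads like the one above only work when user input is concatenated directly into SQL text. The usual fix is to bind the value as a query parameter; a minimal Go sketch with database/sql (the query, table and column names are illustrative, not taken from chengying's code):

package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/go-sql-driver/mysql"
)

// findHostsByName passes the user-supplied value as a placeholder argument,
// so a payload such as 'and(select*from(select+sleep(15))a//union//select+1)='
// is matched as a literal string instead of being executed as SQL.
func findHostsByName(db *sql.DB, userInput string) ([]string, error) {
	rows, err := db.Query("SELECT hostname FROM deploy_host WHERE hostname = ?", userInput)
	if err != nil {
		return nil, err
	}
	defer rows.Close()

	var names []string
	for rows.Next() {
		var name string
		if err := rows.Scan(&name); err != nil {
			return nil, err
		}
		names = append(names, name)
	}
	return names, rows.Err()
}

func main() {
	db, err := sql.Open("mysql", "user:pass@tcp(127.0.0.1:3306)/dtagent")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()
	names, err := findHostsByName(db, "' and sleep(15) -- ")
	fmt.Println(names, err)
}

Any endpoint that builds SQL by string concatenation from request parameters is a candidate for the same treatment.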

[Deployment Center][Cluster Details][Cluster Commands] does not show the real deployment logs of the product package.

Describe the bug
When clicking [Deployment Center][Cluster Details][Cluster Commands] to view the deployment log, it shows: stat /matrix/easyagent/shell_history/5a7c0586-c3be-46b0-a9fd-cb328852b3dd/2023-04-24/0/shell.log: no such file or directory.

Expected behavior
I expect to see the actual deployment logs, for both failed and successful deployments.

Screenshots
The Cluster Commands display window.

Desktop (please complete the following information):

  • OS: Windows 11
  • Browser: Sogou Browser, Google Chrome
  • Version: stable release

Additional context
Deployment version: 1.1.1

Deployment summary: agent exited abnormally: agent run error(unexpected): stop supervisor: e8771b72-8e28-49d8-ab25-3b9ff4cbf37b

The following error appears while packaging and deploying the project:
Deployment summary: agent exited abnormally: agent run error(unexpected): stop supervisor: e8771b72-8e28-49d8-ab25-3b9ff4cbf37b

The contents of schema.yml are as follows:

parent_product_name: top-lab
product_name: top-lab
product_name_display: top-lab
product_version: 1.0.0

service:

  TopLab:
    service_display: top-lab
    version: 1.0.1
    group: top-ai
    config:
      service_port: 8097
      self_ip: ${@TopLab}
      top_lab_ip_port: ${self_ip}:${service_port} #self service's node ip
      # java_opts: "-Xms256m -Xmx1024m -Dundertow.port=${service_port} -Dundertow.host=0.0.0.0"

    instance:
      cmd: ./start.sh start
      update_recreate: true
#      environment:
#        JAVA_OPTS: ${java_opts}
#      config_paths:
#      - config/jboot.properties
#      - config/jboot-dev.properties
      healthcheck:
        shell: curl http://${top_lab_ip_port}/healthcheck
        #period: 30s #default 60s
        start_period: 30s #default 10s
        timeout: 10s #default 10s
        retries: 3 #default 1
      max_replica: 1
      start_after_install: false
      #post_deploy: chown 0644 dtlog && zkcreate node xxx --ip ${@es}
      #post_undeploy: rm -rf /var/data/dtlog
      logs:
      - logs/output.log

The ChengYing matrix logs are as follows:

MATRIX-DEBUG:2023/09/13 14:07:22 cluster.go:3875: GetClusterProductList: /api/v2/cluster/products
MATRIX-DEBUG:2023/09/13 14:07:24 product_line.go:308: [ProductLine->ProductListOfProductLine] ProductListOfProductLine from EasyMatrix API
MATRIX-DEBUG:2023/09/13 14:07:24 product.go:1578: [ProductName] ProductName from EasyMatrix API
MATRIX-DEBUG:2023/09/13 14:07:24 product_line.go:308: [ProductLine->ProductListOfProductLine] ProductListOfProductLine from EasyMatrix API
MATRIX-DEBUG:2023/09/13 14:07:24 product.go:2003: Service: /api/v2/product/top-lab/version/1.0.0/service
MATRIX-DEBUG:2023/09/13 14:07:24 product.go:2003: Service: /api/v2/product/top-lab/version/1.0.0/service
MATRIX-DEBUG:2023/09/13 14:07:24 product.go:2003: Service: /api/v2/product/top-lab/version/1.0.0/service
MATRIX-DEBUG:2023/09/13 14:07:24 product.go:2003: Service: /api/v2/product/top-lab/version/1.0.0/service
MATRIX-DEBUG:2023/09/13 14:07:24 product.go:1741: [Product->ProductUncheckedServices] get unchecked services from EasyMatrix API
MATRIX-DEBUG:2023/09/13 14:07:24 product.go:1741: [Product->ProductUncheckedServices] get unchecked services from EasyMatrix API
MATRIX-DEBUG:2023/09/13 14:07:24 product.go:1741: [Product->ProductUncheckedServices] get unchecked services from EasyMatrix API
MATRIX-DEBUG:2023/09/13 14:07:24 product.go:1741: [Product->ProductUncheckedServices] get unchecked services from EasyMatrix API
MATRIX-DEBUG:2023/09/13 14:07:26 product.go:9035: [Product->CheckDeployCondition] CheckDeployCondition from EasyMatrix API
MATRIX-DEBUG:2023/09/13 14:07:26 cluster.go:520: RoleInfo: /api/v2/cluster/hosts/role_info?cluster_id=2
MATRIX-ERROR:2023/09/13 14:07:26 product.go:2204: [Product->ServiceGroup] handleUncheckedServicesCore warn: /var/jenkins_home/jobs/em-rel/jobs/chengying_release/workspace/chengying/chengying-server/matrix/api/impl/product.go:5156: unchecked service `` not exist
MATRIX-DEBUG:2023/09/13 14:07:27 product.go:5218: deploy product_name:top-lab, product_version: 1.0.0, userId: 1, clusterId: 2
MATRIX-DEBUG:2023/09/13 14:07:27 product.go:4631: cluster 2 installing new instance and rolling update ...
MATRIX-DEBUG:2023/09/13 14:07:27 product.go:4492: cluster 2 rollingUpdateCore TopLab ...
MATRIX-DEBUG:2023/09/13 14:07:27 product.go:4331: cluster 2 found TopLab old instance ip: [192.168.14.6]
MATRIX-DEBUG:2023/09/13 14:07:27 product.go:4346: found TopLab new instance ip: []
MATRIX-ERROR:2023/09/13 14:07:27 product.go:5228: delete notify event error: sql: no rows in result set
MATRIX-DEBUG:2023/09/13 14:07:27 agent-client.go:164: [AgentClient] AgentStop with params:e8771b72-8e28-49d8-ab25-3b9ff4cbf37b
MATRIX-DEBUG:2023/09/13 14:07:27 agent-client.go:77: [AgentRestCore]LoopAgentRestCore: 1, request uri: /api/v1/agent/e8771b72-8e28-49d8-ab25-3b9ff4cbf37b/stopSync
MATRIX-DEBUG:2023/09/13 14:07:27 agent-client.go:81: [AgentRestCore]LoopAgentRestCore: 1, response body: &{ 0 <nil>}
MATRIX-DEBUG:2023/09/13 14:07:27 agent-client.go:154: [AgentClient] AgentStart with params:&{e8771b72-8e28-49d8-ab25-3b9ff4cbf37b 0 0 0 map[]}
MATRIX-DEBUG:2023/09/13 14:07:27 agent-client.go:77: [AgentRestCore]LoopAgentRestCore: 1, request uri: /api/v1/agent/e8771b72-8e28-49d8-ab25-3b9ff4cbf37b/startSyncWithParam
MATRIX-DEBUG:2023/09/13 14:07:27 agent-client.go:81: [AgentRestCore]LoopAgentRestCore: 1, response body: &{ 0 <nil>}
MATRIX-DEBUG:2023/09/13 14:07:27 product.go:3622: waiting instance(15) GetStatusChan...
MATRIX-DEBUG:2023/09/13 14:07:35 instance.go:528: ExecShellList GetBySeq error: 0
MATRIX-ERROR:2023/09/13 14:07:35 product.go:3643: agent异常退出:agent run error(unexpected):stop supervisor: e8771b72-8e28-49d8-ab25-3b9ff4cbf37b
MATRIX-DEBUG:2023/09/13 14:07:35 product.go:3624: end instance(15) GetStatusChan
MATRIX-DEBUG:2023/09/13 14:07:35 instancer.go:864: [Instancer] Clear
MATRIX-DEBUG:2023/09/13 14:07:35 product.go:4498: rollingUpdateCore TopLab finish(cluster 2 some instance of TopLab update fail)
MATRIX-ERROR:2023/09/13 14:07:35 product.go:4634: 462d4cd3-3814-4fc1-b47c-2e699d02b45e update error: cluster 2 some instance of TopLab update fail
MATRIX-ERROR:2023/09/13 14:07:35 product.go:4554: sql: no rows in result set
MATRIX-ERROR:2023/09/13 14:07:35 product.go:4610: sql: no rows in result set
MATRIX-DEBUG:2023/09/13 14:07:52 cluster_status_monitor.go:48: StartClusterStatusM ...

Bug in how the health check script's return value is validated

Following the official docs, I made curl xxxx/healthcheck return the number 0, but after startup the health check always fails; changing the script to return nothing at all also fails the check. I could only deploy successfully after removing the health check entirely. From reading the code, the check passes when the result returned after calling cmd.Run() is empty.
So what exactly does the health check have to return? Please clarify.
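
If the health check is driven by Go's os/exec (the issue mentions cmd.Run()), then "the returned result is empty" most likely means "Run() returned a nil error", which happens exactly when the command exits with status 0; what the script prints is irrelevant. A minimal sketch under that assumption (the wrapper below is illustrative, not chengying's actual code):

package main

import (
	"fmt"
	"os/exec"
)

// runHealthCheck executes the configured shell snippet and treats a zero
// exit status as healthy. Run returns nil only when the command starts
// successfully and exits with code 0; stdout content is ignored.
func runHealthCheck(shellCmd string) error {
	return exec.Command("/bin/sh", "-c", shellCmd).Run()
}

func main() {
	// curl -sf exits non-zero on connection errors and HTTP >= 400,
	// so the exit status, not the response body, decides the result.
	if err := runHealthCheck("curl -sf http://127.0.0.1:8097/healthcheck"); err != nil {
		fmt.Println("health check failed:", err)
		return
	}
	fmt.Println("health check passed")
}

Under this reading, the script does not need to print anything in particular; it only needs to exit 0 on success and non-zero on failure (for curl, adding -f makes HTTP errors produce a non-zero exit).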

service name top-lab is invalid accessing config

My schema.yml is defined as shown in the screenshot below. When packaging with mero, it reports: service name top-lab is invalid accessing config. Where is the problem?
[screenshot]
The directory structure is as follows:
pkg-dir/
└── top-lab
├── config
│ ├── apiconf
│ │ └── xml
│ ├── lib
│ │ ├── linux
│ │ └── windows
│ ├── vue_temp
│ │ └── template-layout
│ └── xlsx
├── lib
├── logs
└── webapp

schema.yml is placed in the pkg_dir directory.

chengying V1.1.1: adding hosts fails and deploying components fails

After deploying ChengYing V1.1.1, adding a host reports the following error:

Host initialization failed:
+ STATIC_HOST=http://192.168.14.6:8864
+ NTP_SERVER=192.168.14.6
++ dirname /tmp/exec-script-186923485
+ cd /tmp
++ pwd
+ PWD=/tmp
++ ps -ef
++ grep easyagent
++ grep -v grep
++ awk '{print $2}'
+ AGENT_PID=25190
+ check_linux_os
+ '[' -f /etc/redhat-release ']'
+ DistroBasedOn=RedHat
++ cat /etc/redhat-release
++ sed 's/ release.*//'
++ awk -F ' ' '{print $1}'
+ DIST=CentOS
++ cat /etc/redhat-release
++ sed 's/.*release //'
++ sed 's/ .*//'
++ awk -F . '{print $1}'
+ MAJOR_VERSION=7
++ cat /etc/redhat-release
++ sed 's/.*release //'
++ sed 's/ .*//'
++ awk -F . '{print $2}'
+ MINOR_VERSION=4
+ CHECK_OS=CentOS_7_4
++ uname -m
+ CHECK_ARCH=x86_64
+ check_data_dir
+ sudo mkdir -p /data
++ whoami
++ whoami
+ sudo chown top:top /data
+ installyum
+ yellow '\n\n===========================Yum install=====================================\n'
+ CONTENT='\n\n===========================Yum install=====================================\n'
+ case "$CHECK_OS" in
+ set_centos_repo_low_version
+ echo 'installyum: Cent

Agent exits abnormally

*************************** 1.hdfs_journalnode ***************************
Service name: hdfs_journalnode
Service version: 2.8.5
Deployment start time: 2022-06-10 15:36:16
Deployment end time: 2022-06-10 15:36:24
Deployment result: run fail
Deployment summary: agent exited abnormally: agent run error(unexpected): stop supervisor: 968723ae-343a-4cbd-9f1d-5a42ebc84a46
Deployment configuration:
{
  "ServiceDisplay": "",
  "Version": "2.8.5",
  "Instance": {
    "ConfigPaths": [
      "bin/start_journalnode.sh"
    ],
    "Logs": [
      "logs/*.log",
      "logs/userlogs/*/*"
    ],
    "HealthCheck": {
      "Shell": "bin/healthcheck.sh 172.18.8.106 8485 18480",
      "Period": "20s",
      "StartPeriod": "",
      "Timeout": "",
      "Retries": 3
    },
    "RunUser": "root",
    "Cmd": "bin/start_journalnode.sh",
    "PostDeploy": "",
    "PostUpGrade": "",
    "PostUndeploy": "",
    "UnInstall": "",
    "PrometheusPort": "9502"
  },
  "Group": "hdfs",
  "DependsOn": [
    "hadoop_pkg"
  ],
  "Config": {
    "external_log_dir": {
      "Default": "",
      "Desc": "internal",
      "Type": "internal",
      "Value": ""
    }
  },
  "BaseProduct": "",
  "BaseProductVersion": "",
  "BaseService": "",
  "BaseParsed": false,
  "BaseAtrribute": ""
}
Deployment events:
+------------+---------------- 1 --------------------------+
| Instance ID | 31
| Event type | start
| Event time | {2022-06-10 15:36:17 +0800 CST true}
| Event result | start success
| Event details | api/v2/instance/31/event?eventId=112

+------------+---------------- 2 --------------------------+
| Instance ID | 31
| Event type | stop
| Event time | {2022-06-10 15:36:17 +0800 CST true}
| Event result | stop success
| Event details | api/v2/instance/31/event?eventId=111

+------------+---------------- 3 --------------------------+
| Instance ID | 31
| Event type | start
| Event time | {2022-06-10 15:33:30 +0800 CST true}
| Event result | start success
| Event details | api/v2/instance/31/event?eventId=110

+------------+---------------- 4 --------------------------+
| Instance ID | 31
| Event type | stop
| Event time | {2022-06-10 15:33:30 +0800 CST true}
| Event result | stop success
| Event details | api/v2/instance/31/event?eventId=109

+------------+---------------- 5 --------------------------+
| Instance ID | 31
| Event type | start
| Event time | {2022-06-10 15:31:44 +0800 CST true}
| Event result | start success
| Event details | api/v2/instance/31/event?eventId=108

+------------+---------------- 6 --------------------------+
| Instance ID | 31
| Event type | install
| Event time | {2022-06-10 15:31:44 +0800 CST true}
| Event result | install success
| Event details | api/v2/instance/31/event?eventId=107

Hadoop component deployment issue

DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

22/06/14 10:22:49 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: user = root
STARTUP_MSG: host = localhost/127.0.0.1
STARTUP_MSG: args = [-format, -nonInteractive]
STARTUP_MSG: version = 2.8.5
STARTUP_MSG: classpath = /data/hadoop_base/etc/hadoop:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/commons-configuration-1.6.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/commons-digester-1.8.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/commons-io-2.4.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/commons-lang-2.6.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/commons-logging-1.1.3.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/commons-math3-3.1.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/commons-net-3.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/curator-client-2.7.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/curator-framework-2.7.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/curator-recipes-2.7.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/gson-2.2.4.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/guava-11.0.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/hadoop-annotations-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/hadoop-auth-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/hamcrest-core-1.3.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/htrace-core4-4.0.1-incubating.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/httpclient-4.5.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/httpcore-4.4.4.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jcip-annotations-1.0-1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jersey-core-1.9.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jersey-json-1.9.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jersey-server-1.9.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jets3t-0.9.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jettison-1.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jetty-6.1.26.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jetty-sslengine-6.1.26.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jetty-util-6.1.26.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jsch-0.1.54.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/json-smart-1.3.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jsp-api-2.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/jsr305-3.0.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/junit-4.11.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/log4j-1.2.17.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/mockito-all-1.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/netty-3.6.2.Final.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/nimbus-jose-jwt-4.41.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/paranamer-2.3.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/opt/dtstack/Hadoop/hadoop_pk
g/share/hadoop/common/lib/servlet-api-2.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/stax-api-1.0-2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/xmlenc-0.52.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/xz-1.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/zookeeper-3.4.6.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/activation-1.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/asm-3.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/avro-1.7.4.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/commons-cli-1.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/commons-codec-1.4.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/commons-collections-3.2.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/lib/commons-compress-1.4.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/hadoop-common-2.8.5-tests.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/hadoop-common-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/common/hadoop-nfs-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/asm-3.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/commons-io-2.4.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/guava-11.0.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/hadoop-hdfs-client-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/htrace-core4-4.0.1-incubating.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/jsr305-3.0.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/netty-all-4.0.23.Final.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/okhttp-2.4.0.jar:/opt/dtstack/Hadoop/
hadoop_pkg/share/hadoop/hdfs/lib/okio-1.4.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/hadoop-hdfs-2.8.5-tests.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/hadoop-hdfs-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/hadoop-hdfs-client-2.8.5-tests.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/hadoop-hdfs-client-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/hadoop-hdfs-native-client-2.8.5-tests.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/hadoop-hdfs-native-client-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/hdfs/hadoop-hdfs-nfs-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/jersey-json-1.9.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/jersey-server-1.9.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/jettison-1.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/jetty-6.1.26.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/json-io-2.5.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/jsr305-3.0.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/log4j-1.2.17.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/servlet-api-2.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/xz-1.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/zookeeper-3.4.6-tests.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/activation-1.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/aopalliance-1.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/asm-3.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/commons-cli-1.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/commons-codec-1.4.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/commons-collections-3.2.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/commons-io-2.4.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/commons-lang-2.6.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/commons-math-2.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/curator-client-2.7.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/curator-test-2.7.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/fst-2.50.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/guava-11.0.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/guice-3.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/opt/dtsta
ck/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/java-util-1.9.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/javassist-3.18.1-GA.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/javax.inject-1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/jersey-client-1.9.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/lib/jersey-core-1.9.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/hadoop-yarn-api-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/hadoop-yarn-client-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/hadoop-yarn-common-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/hadoop-yarn-registry-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/hadoop-yarn-server-common-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/hadoop-yarn-server-tests-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/hadoop-yarn-server-timeline-pluginstorage-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/asm-3.2.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/guice-3.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/hadoop-annotations-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/javax.inject-1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/junit-4.11.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduc
e/lib/paranamer-2.3.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/lib/xz-1.0.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.8.5-tests.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.8.5.jar:/opt/dtstack/Hadoop/hadoop_pkg/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.8.5.jar:/data/hadoop_base/contrib/capacity-scheduler/.jar:/data/hadoop_base/contrib/capacity-scheduler/.jar
STARTUP_MSG: build = Unknown -r Unknown; compiled by 'root' on 2021-11-16T12:01Z
STARTUP_MSG: java = 1.8.0_144
************************************************************/
22/06/14 10:22:49 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
22/06/14 10:22:49 INFO namenode.NameNode: createNameNode [-format, -nonInteractive]
22/06/14 10:22:49 INFO namenode.FSEditLog: Edit logging is async:true
22/06/14 10:22:49 INFO namenode.FSNamesystem: KeyProvider: null
22/06/14 10:22:49 INFO namenode.FSNamesystem: fsLock is fair: true
22/06/14 10:22:49 INFO namenode.FSNamesystem: Detailed lock hold time metrics enabled: false
22/06/14 10:22:49 ERROR blockmanagement.DatanodeManager: error reading hosts files:
java.io.FileNotFoundException: null (No such file or directory)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at org.apache.hadoop.util.HostsFileReader.readFileToSet(HostsFileReader.java:65)
at org.apache.hadoop.hdfs.server.blockmanagement.HostFileManager.readFile(HostFileManager.java:78)
at org.apache.hadoop.hdfs.server.blockmanagement.HostFileManager.refresh(HostFileManager.java:150)
at org.apache.hadoop.hdfs.server.blockmanagement.HostFileManager.refresh(HostFileManager.java:70)
at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.<init>(DatanodeManager.java:205)
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.<init>(BlockManager.java:318)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:758)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:724)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1103)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1567)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1694)
22/06/14 10:22:49 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
22/06/14 10:22:49 INFO blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=false
22/06/14 10:22:49 INFO blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
22/06/14 10:22:49 INFO blockmanagement.BlockManager: The block deletion will start around 2022 Jun 14 10:22:49
22/06/14 10:22:49 INFO util.GSet: Computing capacity for map BlocksMap
22/06/14 10:22:49 INFO util.GSet: VM type = 64-bit
22/06/14 10:22:49 INFO util.GSet: 2.0% max memory 981.5 MB = 19.6 MB
22/06/14 10:22:49 INFO util.GSet: capacity = 2^21 = 2097152 entries
22/06/14 10:22:49 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
22/06/14 10:22:49 INFO blockmanagement.BlockManager: defaultReplication = 3
22/06/14 10:22:49 INFO blockmanagement.BlockManager: maxReplication = 512
22/06/14 10:22:49 INFO blockmanagement.BlockManager: minReplication = 1
22/06/14 10:22:49 INFO blockmanagement.BlockManager: maxReplicationStreams = 2
22/06/14 10:22:49 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
22/06/14 10:22:49 INFO blockmanagement.BlockManager: encryptDataTransfer = false
22/06/14 10:22:49 INFO blockmanagement.BlockManager: maxNumBlocksToLog = 1000
22/06/14 10:22:49 INFO namenode.FSNamesystem: fsOwner = root (auth:SIMPLE)
22/06/14 10:22:49 INFO namenode.FSNamesystem: supergroup = supergroup
22/06/14 10:22:49 INFO namenode.FSNamesystem: isPermissionEnabled = true
22/06/14 10:22:49 INFO namenode.FSNamesystem: Determined nameservice ID: ns1
22/06/14 10:22:49 INFO namenode.FSNamesystem: HA Enabled: false
22/06/14 10:22:49 WARN namenode.FSNamesystem: Configured NNs:
Nameservice :
NN ID nn1 => /172.18.8.104:9000

22/06/14 10:22:49 ERROR namenode.FSNamesystem: FSNamesystem initialization failed.
java.io.IOException: Invalid configuration: a shared edits dir must not be specified if HA is not enabled.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:783)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:724)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1103)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1567)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1694)
22/06/14 10:22:49 INFO namenode.FSNamesystem: Stopping services started for active state
22/06/14 10:22:49 INFO namenode.FSNamesystem: Stopping services started for standby state
22/06/14 10:22:49 WARN namenode.NameNode: Encountered exception during format:
java.io.IOException: Invalid configuration: a shared edits dir must not be specified if HA is not enabled.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:783)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:724)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1103)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1567)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1694)
22/06/14 10:22:49 ERROR namenode.NameNode: Failed to start namenode.
java.io.IOException: Invalid configuration: a shared edits dir must not be specified if HA is not enabled.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:783)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:724)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1103)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1567)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1694)
22/06/14 10:22:49 INFO util.ExitUtil: Exiting with status 1
22/06/14 10:22:49 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at localhost/127.0.0.1
************************************************************/

Host initialization fails when the host being added is the same machine as the EM server

Version: 1.1.1
OS: CentOS 7.9
There are 3 hosts, one of which has EM installed. A cluster was created through the UI and all three servers were added to it. On the host running EM, the agent reported an error after initialization completed; the cause appears to be that ntpd fails to start (systemctl restart ntpd) because a process belonging to the EM server is already using ntpd.
[screenshots]

Host initialization failed:
+ STATIC_HOST=http://192.168.21.40:8864
+ NTP_SERVER=192.168.21.40
++ dirname /tmp/exec-script-258327147
+ cd /tmp
++ pwd
+ PWD=/tmp
++ ps -ef
++ grep easyagent
++ grep -v grep
++ awk '{print $2}'
+ AGENT_PID=52881
+ check_linux_os
+ '[' -f /etc/redhat-release ']'
+ DistroBasedOn=RedHat
++ cat /etc/redhat-release
++ sed 's/ release.*//'
++ awk -F ' ' '{print $1}'
+ DIST=CentOS
++ cat /etc/redhat-release
++ sed 's/.*release //'
++ sed 's/ .*//'
++ awk -F . '{print $1}'
+ MAJOR_VERSION=7
++ cat /etc/redhat-release
++ sed 's/.*release //'
++ sed 's/ .*//'
++ awk -F . '{print $2}'
+ MINOR_VERSION=9
+ CHECK_OS=CentOS_7_9
++ uname -m
+ CHECK_ARCH=x86_64
+ check_data_dir
+ sudo mkdir -p /data
++ whoami
++ whoami
+ sudo chown admin:admin /data
+ installyum
+ yellow '\
\
===========================Yum install=====================================\
'
+ CONTENT='\
\
===========================Yum install=====================================\
'
+ case \"$CHECK_OS\" in
+ set_centos_repo_high_version
+ echo 'installyu"

Although the cluster keeps showing an Error flag, product packages can still be deployed normally. If the server and the agent conflict over ntpd, I hope this error indication can be removed.

MySQL component deployment

  • agent_zip=mysql.zip
  • app_dir=/opt/dtstack/MYSQL/mysql
  • agent_bin=/opt/dtstack/MYSQL/mysql/sh
  • run_user=root
  • data_dir=
  • unzip_tmp_dir=/opt/dtstack/tmp
  • DOWNLOAD_URL=http://172.18.8.104:8864/easyagent/MYSQL/5.7.35/mysql.zip
  • trap '[ "$?" -eq 0 ] || read -p "Looks like something went wrong in step ´$STEP´"' EXIT
  • install
  • STEP='install agent'
  • echo 'Use the curl download and install Please Waiting...'
    Use the curl download and install Please Waiting...
  • '[' '!' -d /opt/dtstack/tmp ']'
  • cd /opt/dtstack/tmp
  • sudo curl -L -O -s http://172.18.8.104:8864/easyagent/MYSQL/5.7.35/mysql.zip
  • install_agent
  • mkdir -p /opt/dtstack/MYSQL/mysql
  • unzip -o /opt/dtstack/tmp/mysql.zip -d /opt/dtstack/MYSQL/mysql
  • '[' '!' -f /opt/dtstack/MYSQL/mysql/sh ']'
  • echo 'cmd: /opt/dtstack/MYSQL/mysql/sh not found!'
    cmd: /opt/dtstack/MYSQL/mysql/sh not found!
  • exit 1
  • '[' 1 -eq 0 ']'
  • read -p 'Looks like something went wrong in step ´install agent´'

Initialization fails when creating a cluster from hosts

[screenshot]

These 3 hosts had previously been added as an ordinary host cluster; the agent was then uninstalled and Kubernetes was reinstalled on them.

[root@node-192-168-64-81 ~]# tail -f /opt/dtstack/easymanager/easyagent/logs/agent.log
AGENT-DEBUG:2022/06/10 10:23:13 tc_linux.go:79: initialize tc configuration
AGENT-DEBUG:2022/06/10 10:23:13 tc_linux.go:49: delete old tc configuration
AGENT-DEBUG:2022/06/10 10:23:13 monitor.go:214: start collectSystemMetrics...
AGENT-DEBUG:2022/06/10 10:23:13 monitor.go:64: start collectMetrics PID: 2372...
AGENT-DEBUG:2022/06/10 10:23:14 controller.go:103: recv control command: EXEC_SCRIPT
AGENT-DEBUG:2022/06/10 10:23:16 controller.go:103: recv control command: EXEC_SCRIPT
AGENT-DEBUG:2022/06/10 10:23:16 util_linux.go:137: kill pgid: 2438
AGENT-DEBUG:2022/06/10 10:23:16 controller.go:103: recv control command: EXEC_SCRIPT
AGENT-DEBUG:2022/06/10 10:23:16 util_linux.go:137: kill pgid: 2440
AGENT-DEBUG:2022/06/10 10:23:19 util_linux.go:137: kill pgid: 2396

Where can I find more detailed logs?

Hosts with OS versions below 7.4 fail initialization when added

Information

  • OS: CentOS 7.2
  • Version: 1.1.1

Problem description
Adding a host node fails; the agent log shows that node_exporter installation failed.

[screenshot]

Inspecting the executed script shows it only handles versions 7.4-7.9 and does not cover 7.2.

Temporary workaround:
Modify the script to also accept 7.2.

[screenshot]

However, I don't know where this script is located, so for now I ran it once manually.
