
kuimivm's Introduction

Project KuimiVM

Hello, this is Kilio Kuara.

For various reasons, we have finally decided to open-source this project. We know it may no longer matter going forward, but we still hope to leave our mark on the history of QQ bots.

Setting QQ bots aside, the core logic this project implements is worth open-sourcing in its own right. So we have chosen to make the technology public.

The second movement has come to a close; where, then, will the third be found?

Structure

  • /src — KuimiVM core
  • /packer — packing module
  • /tencent — magic-signer server
  • /struct-define — stub of AndroidQQ.apk

magic-signer-guide

This project solves the sso sign and tlv encryption problems for various QQ bot frameworks.

It is an RPC backend that exposes an HTTP API, which means you can implement different RPC clients to suit your framework's needs.

Because this project plays a rather sensitive role, the client and server must authenticate each other before any business operations can be performed, to keep their communication secure.

Within its controllable scope (1), this project (the docker image kiliokuara/vivo50; the same applies below) does not persist the following QQ bot information:

  • login credentials (tokens, cookies, etc.)
  • tlv data to be encrypted
  • sso packet data to be signed

To streamline its business logic, this project does persist the following QQ bot information:

  • device information
  • account-related key-value pairs produced by libfekit

We strongly recommend deploying the RPC server yourself; avoid open RPC servers deployed by others.

(1) "Controllable scope" means all code of the RPC server itself. Because the project uses the external library libfekit, we cannot know what information libfekit may persist.

Supported QQ versions

  • Android 8.9.58.11170

Usage

Deploying the RPC server with docker

$ docker pull kiliokuara/vivo50:latest
$ docker run -d --restart=always \
  -e SERVER_IDENTITY_KEY=vivo50 \
  -e AUTH_KEY=kfc \
  -e PORT=8888 \
  -p 8888:8888 \
  --log-opt mode=non-blocking --log-opt max-buffer-size=4m \
  -v /home/vivo50/serverData:/app/serverData \
  -v /home/vivo50/testbot:/app/testbot \
  --name vivo50 \
  --memory 200M \
  kiliokuara/vivo50

Environment variables:

  • SERVER_IDENTITY_KEY: the RPC server identity key, used by clients to verify the server's identity.
  • AUTH_KEY: the RPC client authentication key, used by the server to verify clients.
  • PORT: server port; defaults to 8888.
  • MEMORY_MONITOR: memory monitor, disabled by default; its value is the polling interval in seconds.

For more details, see https://docs.docker.com/engine/reference/commandline/run/

Memory usage reference

Before logging in a bot

2023-07-27 16:29:46 [DEBUG] [Vivo45#1] MemoryDumper -                  committed |  init   |  used   |   max  
2023-07-27 16:29:46 [DEBUG] [Vivo45#1] MemoryDumper -     Heap Memory:  200.0MB  | 200.0MB | 18.47MB | 200.0MB
2023-07-27 16:29:46 [DEBUG] [Vivo45#1] MemoryDumper - Non-Heap Memory:  25.25MB  | 7.31MB  | 23.27MB |   -1   
2023-07-27 16:29:51 [DEBUG] [Vivo45#2] MemoryDumper -                  committed |  init   |  used   |   max  
2023-07-27 16:29:51 [DEBUG] [Vivo45#2] MemoryDumper -     Heap Memory:  44.0MB   | 200.0MB | 12.08MB | 200.0MB
2023-07-27 16:29:51 [DEBUG] [Vivo45#2] MemoryDumper - Non-Heap Memory:  25.25MB  | 7.31MB  | 22.17MB |   -1   
2023-07-27 16:29:56 [DEBUG] [Vivo45#1] MemoryDumper -                  committed |  init   |  used   |   max  
2023-07-27 16:29:56 [DEBUG] [Vivo45#1] MemoryDumper -     Heap Memory:  44.0MB   | 200.0MB | 11.08MB | 200.0MB
2023-07-27 16:29:56 [DEBUG] [Vivo45#1] MemoryDumper - Non-Heap Memory:  25.25MB  | 7.31MB  | 21.62MB |   -1   


After logging in one bot

2023-07-27 16:30:41 [DEBUG] [Vivo45#3] MemoryDumper -                  committed |  init   |  used   |   max  
2023-07-27 16:30:41 [DEBUG] [Vivo45#3] MemoryDumper -     Heap Memory:  52.0MB   | 200.0MB | 33.13MB | 200.0MB
2023-07-27 16:30:41 [DEBUG] [Vivo45#3] MemoryDumper - Non-Heap Memory:  44.56MB  | 7.31MB  | 41.21MB |   -1   
2023-07-27 16:30:46 [DEBUG] [Vivo45#3] MemoryDumper -                  committed |  init   |  used   |   max  
2023-07-27 16:30:46 [DEBUG] [Vivo45#3] MemoryDumper -     Heap Memory:  52.0MB   | 200.0MB | 28.15MB | 200.0MB
2023-07-27 16:30:46 [DEBUG] [Vivo45#3] MemoryDumper - Non-Heap Memory:  44.56MB  | 7.31MB  | 40.68MB |   -1

Authentication flow

1. Fetch RPC server info and verify the server's identity

First call the API GET /service/rpc/handshake/config. The response looks like this:

{
  "publicKey": "", // RSA public key, used by the client to verify the server's identity and, in the next step, to encrypt the handshake payload.
  "timeout": 10000, // session expiry time (milliseconds)
  "keySignature": "" // signature of the server public key, used by the client to verify the server's identity
}

To prevent a MITM (man-in-the-middle) attack, the client must verify the server's identity with the following computation:

$clientKeySignature = $sha1(
    $sha1( ($SERVER_IDENTITY_KEY + $publicKey).getBytes() ).hex() + $SERVER_IDENTITY_KEY
).hex()

Compare clientKeySignature against the keySignature returned by the API to verify the server's identity.

In Kotlin, for example:

import java.security.MessageDigest
import java.util.HexFormat

fun ByteArray.sha1(): ByteArray {
    return MessageDigest.getInstance("SHA1").digest(this)
}
fun ByteArray.hex(): String {
    return HexFormat.of().withLowerCase().formatHex(this)
}

val serverIdentityKey: String = ""  // the server's SERVER_IDENTITY_KEY
val publicKey: String = ""          // the publicKey string from the API, a base64-encoded RSA public key
val serverKeySignature: String = "" // the keySignature string from the API, the signature computed by the server

val pKeyRsaSha1 = (serverIdentityKey + publicKey).toByteArray().sha1()
val clientKeySignature = (pKeyRsaSha1.hex() + serverIdentityKey).toByteArray().sha1().hex()

if (clientKeySignature != serverKeySignature) {
    throw IllegalStateException("client-calculated key signature doesn't match the one the server provided.")
}

2. Handshake with the server

Before handshaking with the server, the client must generate a 16-byte AES key and a 4096-bit RSA key pair.

  • The AES key encrypts and decrypts the WebSocket business traffic after a successful handshake.
  • The RSA key pair prevents a replay attack from re-establishing the same WebSocket connection. (A key-generation sketch follows this list.)
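
The document does not prescribe how to generate these keys; the following minimal Kotlin sketch uses the standard JCA, in the spirit of the Kotlin example above. The names aesKey and rsaKeyPair are ours and are reused in the later sketches.

import java.security.KeyPairGenerator
import javax.crypto.KeyGenerator

// 16-byte (128-bit) AES key for the post-handshake WebSocket traffic.
val aesKey: ByteArray = KeyGenerator.getInstance("AES")
    .apply { init(128) }
    .generateKey()
    .encoded

// 4096-bit RSA key pair; the public key is sent in the handshake request,
// and the private key later signs the X-SEC-Time header (step 3).
val rsaKeyPair = KeyPairGenerator.getInstance("RSA")
    .apply { initialize(4096) }
    .generateKeyPair()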

After generating the keys, call the API POST /service/rpc/handshake/handshake with the following request body:

{
  "clientRsa": "", // the public key of the client-generated RSA key pair, base64-encoded.
  "secret": "....", // the handshake payload, encrypted with the RSA/ECB/PKCS1Padding suite using the publicKey from step 1, "Fetch RPC server info".
}

// The handshake payload looks like this:
{
  "authorizationKey": "", // the server's AUTH_KEY
  "sharedKey": "", // the AES key
  "botid": 1234567890, // the bot's QQ number
}
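
A hedged sketch of building the secret field, reusing publicKey from step 1 and aesKey from the sketch above. The document fixes only the cipher suite; that the server key is X.509/SubjectPublicKeyInfo-encoded, that sharedKey is base64-encoded inside the JSON, and that the resulting secret is base64-encoded are all assumptions.

import java.security.KeyFactory
import java.security.spec.X509EncodedKeySpec
import java.util.Base64
import javax.crypto.Cipher

// Parse the base64-encoded server public key (X.509 encoding assumed).
val serverRsaKey = KeyFactory.getInstance("RSA")
    .generatePublic(X509EncodedKeySpec(Base64.getDecoder().decode(publicKey)))

// The handshake payload; base64-encoding sharedKey is an assumption.
val secretJson =
    """{"authorizationKey":"kfc","sharedKey":"${Base64.getEncoder().encodeToString(aesKey)}","botid":1234567890}"""

// Encrypt with the cipher suite the document names, then base64-encode (assumption).
val secret = Cipher.getInstance("RSA/ECB/PKCS1Padding")
    .apply { init(Cipher.ENCRYPT_MODE, serverRsaKey) }
    .doFinal(secretJson.toByteArray())
    .let { Base64.getEncoder().encodeToString(it) }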

The response:

{
  "status": 200, // 200 = handshake succeeded, 403 = handshake failed
  "reason": "Authorization code is invalid.", // reason for failure; present only when the handshake fails.
  "token": "", // WebSocket communication token, base64-encoded; present only when the handshake succeeds.
}

Base64-decode the token. The handshake is now complete, and WebSocket communication comes next.

3. Open a WebSocket session

Connect to the API WEBSOCKET /service/rpc/session; the request must carry the following headers (a sketch for computing them follows):

Authorization: $token_decoded     <--- the base64-decoded token
X-SEC-Time: $timestamp_millis     <--- the current timestamp, in milliseconds
X-SEC-Signature: $timestamp_sign  <--- the timestamp's signature: SHA256withRSA with the client RSA key, base64-encoded.
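
A sketch for computing the three headers, reusing rsaKeyPair from step 2; token stands for the base64 token string from the handshake response. The document fixes SHA256withRSA; signing the UTF-8 bytes of the decimal timestamp string is an assumption about the exact input.

import java.security.Signature
import java.util.Base64

val token: String = "" // the token from the handshake response

val timestamp = System.currentTimeMillis().toString()
val timestampSign = Signature.getInstance("SHA256withRSA").run {
    initSign(rsaKeyPair.private)
    update(timestamp.toByteArray())
    sign()
}

val headers = mapOf(
    "Authorization" to String(Base64.getDecoder().decode(token)), // base64-decoded token
    "X-SEC-Time" to timestamp,
    "X-SEC-Signature" to Base64.getEncoder().encodeToString(timestampSign),
)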

Once the WebSocket session is open, business communication can begin.

4. Check the WebSocket session status

Before each C2S packet is sent over the WebSocket, it is recommended to verify the current session's status.

Call the API GET /service/rpc/session/check, with the same headers as in 3. Open a WebSocket session.

Response status codes:

  • 204: the session is valid
  • 403: authentication failed; check the headers.
  • 404: the session does not exist

5. Terminate the WebSocket session

Once all work is done (for example, the bot goes offline or the bot framework shuts down), the client should actively terminate the WebSocket session.

Call the API DELETE /service/rpc/session, with the same headers as in 3. Open a WebSocket session.

Response status codes:

  • 204: the session has been terminated
  • 403: authentication failed; check the headers.
  • 404: the session does not exist

WebSocket communication format

General conventions

  • All C2S (client-to-server) and S2C (server-to-client) packets are encrypted into byte arrays with the client's AES key (see the encryption sketch at the end of this section).

  • The general S2C packet format is:

{
  "packetId": "", // unique packet ID
  "packetType": "", // packet type, corresponding to a business operation.
  ..., // other properties specific to the packet type
}

There are two cases for packetId:

  1. If a C2S packet carries a packetId and requires a server response, then the S2C packet is the response to that C2S packet and reuses its packetId.
  2. If an S2C packet's packetType is rpc.service.send, the S2C packet requires a client response, and the responding C2S packet must carry that S2C packet's packetId.

  • When a business operation hits an error, the S2C error packet looks like:

{
    "packetId": .......,
    "packetType": "service.error",
    "message": "",
}

The server never sends business-error packets on its own initiative; the packetId always corresponds to the packetId of the C2S packet that caused the error.
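
The document pins down only that packets are AES-encrypted byte arrays under the client key; the transformation used below (AES/ECB/PKCS5Padding) is purely an assumption for this sketch and must be matched to what the server actually implements. aesKey is the 16-byte key generated for the handshake.

import javax.crypto.Cipher
import javax.crypto.spec.SecretKeySpec

val packetKey = SecretKeySpec(aesKey, "AES")

// JSON text -> encrypted frame to send over the WebSocket.
fun encryptPacket(json: String): ByteArray =
    Cipher.getInstance("AES/ECB/PKCS5Padding").run {
        init(Cipher.ENCRYPT_MODE, packetKey)
        doFinal(json.toByteArray())
    }

// Encrypted frame received over the WebSocket -> JSON text.
fun decryptPacket(frame: ByteArray): String =
    Cipher.getInstance("AES/ECB/PKCS5Padding").run {
        init(Cipher.DECRYPT_MODE, packetKey)
        String(doFinal(frame))
    }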

Business operations

Session interruption

S2C

{
    "packetType": "service.interrupt",
    "reason": "Interrupted by session invalidate",
}

Receiving this packet means the current WebSocket session has been invalidated; the client must perform the handshake again to obtain a new session.

Initialize the signing and encryption service

C2S

{
  "packetId": "",
  "packetType": "rpc.initialize",
  "extArgs": {
    "KEY_QIMEI36": "", // qimei 36
    "BOT_PROTOCOL": {
      "protocolValue": {
        "ver": "8.9.58",
      }
    }
  },
  "device": { // 除特殊标记,参数均为 value.toByteArray().hexString()
    "display": "",
    "product": "",
    "device": "",
    "board": "",
    "brand": "",
    "model": "",
    "bootloader": "",
    "fingerprint": "",
    "bootId": "", // raw string
    "procVersion": "",
    "baseBand": "",
    "version": {
      "incremental": "",
      "release": "",
      "codename": "",
      "sdk": 0 // int
    },
    "simInfo": "",
    "osType": "",
    "macAddress": "",
    "wifiBSSID": "",
    "wifiSSID": "",
    "imsiMd5": "",
    "imei": "", // raw string
    "apn": "",
    "androidId": "",
    "guid": ""
  },
},

S2C response

{
  "packetId": "",
  "packetType": "rpc.initialize"
}

tlv encryption is only available after the service has been initialized.

During initialization the server sends rpc.service.send packets; see Packets the server sends through the bot framework for details.

Fetch the sso signing whitelist

C2S

{
  "packetId": "",
  "packetType": "rpc.get_cmd_white_list"
}

S2C response

{
  "packetId": "",
  "packetType": "rpc.get_cmd_white_list",
  "response": [
    "wtlogin.login",
    ...,
  ]
}

This returns the list of commands whose packets require sso signing, helping the bot framework decide whether a given network packet needs to be signed; a membership-check sketch follows.
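
A trivial sketch of consulting the whitelist; whiteList would be populated from the response array above, and needsSsoSign is a hypothetical helper name.

// Cache the whitelist once after rpc.get_cmd_white_list returns.
val whiteList: Set<String> = setOf("wtlogin.login" /* , ... the rest of the response */)

// Consult it before building each outgoing packet.
fun needsSsoSign(command: String): Boolean = command in whiteList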

Packets the server sends through the bot framework

S2C

{
  "packetId": "server-...",
  "packetType": "rpc.service.send",
  "remark": "msf.security", // sso 包标记,可忽略
  "command": "trpc.o3.ecdh_access.EcdhAccess.SsoEstablishShareKey", // sso 包指令
  "botUin": 1234567890, // bot id
  "data": "" // RPC 服务端需要发送的包内容 bytes.hexString()
}

C2S response

{
  "packetId": "server-...",
  "packetType": "rpc.service.send",
  "command": "trpc.o3.ecdh_access.EcdhAccess.SsoEstablishShareKey",
  "data": "" // QQ 服务器包响应的内容 bytes.hexString()
}

After receiving rpc.service.send, the client must wrap data into an sso packet and send it to the QQ server through the bot framework's network layer.

When the QQ server replies, simply parse out the command and related fields, and pass the remaining content into the data of the C2S response packet.

Note that every packet the server sends through the bot framework requires an sso signature, so after receiving an rpc.service.send packet, apply sso signing to the wrapped network packet before sending it. A hypothetical end-to-end sketch follows.
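
A hypothetical sketch of the full round trip. ServiceSendPacket, wrapSso, signSso, sendToQQ, and respondToRpc are all stand-ins for the bot framework's own layers; none of them exist in this project.

data class ServiceSendPacket(val packetId: String, val command: String, val data: String)

// Framework-specific stubs (all hypothetical):
fun wrapSso(command: String, body: ByteArray): ByteArray = TODO("wrap into an sso packet")
fun signSso(sso: ByteArray): ByteArray = TODO("rpc.sign the wrapped packet, attach the result")
fun sendToQQ(packet: ByteArray): ByteArray = TODO("framework network layer")
fun respondToRpc(packetId: String, command: String, dataHex: String): Unit = TODO("C2S response")

fun handleServiceSend(packet: ServiceSendPacket) {
    // 1. hex-decode the payload the RPC server wants sent
    val body = packet.data.chunked(2).map { it.toInt(16).toByte() }.toByteArray()
    // 2. wrap into an sso packet and 3. sso-sign it (mandatory, see the note above)
    val signed = signSso(wrapSso(packet.command, body))
    // 4. send to the QQ server, then echo the reply back as the C2S response
    val reply = sendToQQ(signed)
    respondToRpc(packet.packetId, packet.command, reply.joinToString("") { "%02x".format(it) })
}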

sso signing

C2S

{
  "packetId": "",
  "packetType": "rpc.sign",
  "seqId": 33782, // sso 包的 sequence id
  "command": "wtlogin.login", // sso 包指令
  "extArgs": {}, // 额外参数,为空
  "content": "" // sso 包内容 bytes.hexString()
}

S2C response

{
  "packetId": "",
  "packetType": "rpc.sign",
  "response": {
    "sign": "",
    "extra": "",
    "token": ""
  }
}

tlv encryption

C2S

{
  "packetId": "",
  "packetType": "rpc.tlv",
  "tlvType": 1348, // 0x544
  "extArgs": {
    "KEY_COMMAND_STR": "810_a"
  },
  "content": "" // t544 内容 bytes.hexString()
}

S2C response

{
  "packetId": "",
  "packetType": "rpc.tlv",
  "response": "" // 加密结果 bytes.hexString()
}

kuimivm's People

Contributors

d3-3109 · kiliokuara


kuimivm's Issues

Docker container grows abnormally large

After the docker container has been running for a long time, it becomes abnormally large.
After entering the container with

docker exec -it vivo50 /bin/bash

you will find some files named core.<number>. Can these files be cleaned up? And can the older ones be deleted while the container is running, to free up space?

Error: docker repository does not exist

~/mirai-dice-release-noextra# docker login
Authenticating with existing credentials...
WARNING! Your password will be stored unencrypted in /root/.docker/config.json.
Configure a credential helper to remove this warning. See
https://docs.docker.com/engine/reference/commandline/login/#credentials-store

Login Succeeded
~/mirai-dice-release-noextra# docker pull kiliokuara/vivo50:latest
Error response from daemon: pull access denied for kiliokuara/vivo50, repository does not exist or may require 'docker login': denied: requested access to the resource is denied

Startup error after deploying with docker

Steps to reproduce:
On CentOS 7, run the following commands:
$ docker pull kiliokuara/vivo50:latest
$ docker run -d --restart=always \
  -e SERVER_IDENTITY_KEY=vivo50 \
  -e AUTH_KEY=kfc \
  -e PORT=8888 \
  -p 8888:8888 \
  --log-opt mode=non-blocking --log-opt max-buffer-size=4m \
  -v /home/vivo50/serverData:/app/serverData \
  -v /home/vivo50/testbot:/app/testbot \
  --name vivo50 \
  --memory 200M \
  kiliokuara/vivo50

Then, checking the container status with docker ps shows vivo50 stuck in the restarting state, looping forever.
Below is the output of docker logs vivo50, including the error:
2023-07-30 03:38:22 [INFO ] [main] RpcServerBootstrap - unpacking resources.
2023-07-30 03:38:22 [INFO ] [main] n1cbc826620954c6491c6d608dc024736 - extracting linuxfile/ls
Exception in thread "main" java.lang.RuntimeException: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at kfc.ne6e75658fda8432484ce61cae30527b9.n1cbc826620954c6491c6d608dc024736.a(Unknown Source)
at tencentlibfekit.vmservice.rpc.RpcServerBootstrap.main(Unknown Source)
Caused by: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:94)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
at java.base/sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:218)
at java.base/java.nio.file.spi.FileSystemProvider.newOutputStream(FileSystemProvider.java:484)
at java.base/java.nio.file.Files.newOutputStream(Files.java:228)
at java.base/java.nio.file.Files.copy(Files.java:3161)
... 2 more
2023-07-30 03:38:23 [INFO ] [main] RpcServerBootstrap - unpacking resources.
2023-07-30 03:38:23 [INFO ] [main] n1cbc826620954c6491c6d608dc024736 - extracting linuxfile/ls
Exception in thread "main" java.lang.RuntimeException: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at kfc.ne6e75658fda8432484ce61cae30527b9.n1cbc826620954c6491c6d608dc024736.a(Unknown Source)
at tencentlibfekit.vmservice.rpc.RpcServerBootstrap.main(Unknown Source)
Caused by: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:94)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
at java.base/sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:218)
at java.base/java.nio.file.spi.FileSystemProvider.newOutputStream(FileSystemProvider.java:484)
at java.base/java.nio.file.Files.newOutputStream(Files.java:228)
at java.base/java.nio.file.Files.copy(Files.java:3161)
... 2 more
2023-07-30 03:38:24 [INFO ] [main] RpcServerBootstrap - unpacking resources.
2023-07-30 03:38:24 [INFO ] [main] n1cbc826620954c6491c6d608dc024736 - extracting linuxfile/ls
Exception in thread "main" java.lang.RuntimeException: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at kfc.ne6e75658fda8432484ce61cae30527b9.n1cbc826620954c6491c6d608dc024736.a(Unknown Source)
at tencentlibfekit.vmservice.rpc.RpcServerBootstrap.main(Unknown Source)
Caused by: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:94)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
at java.base/sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:218)
at java.base/java.nio.file.spi.FileSystemProvider.newOutputStream(FileSystemProvider.java:484)
at java.base/java.nio.file.Files.newOutputStream(Files.java:228)
at java.base/java.nio.file.Files.copy(Files.java:3161)
... 2 more
2023-07-30 03:38:26 [INFO ] [main] RpcServerBootstrap - unpacking resources.
2023-07-30 03:38:26 [INFO ] [main] n1cbc826620954c6491c6d608dc024736 - extracting linuxfile/ls
Exception in thread "main" java.lang.RuntimeException: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at kfc.ne6e75658fda8432484ce61cae30527b9.n1cbc826620954c6491c6d608dc024736.a(Unknown Source)
at tencentlibfekit.vmservice.rpc.RpcServerBootstrap.main(Unknown Source)
Caused by: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:94)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
at java.base/sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:218)
at java.base/java.nio.file.spi.FileSystemProvider.newOutputStream(FileSystemProvider.java:484)
at java.base/java.nio.file.Files.newOutputStream(Files.java:228)
at java.base/java.nio.file.Files.copy(Files.java:3161)
... 2 more
2023-07-30 03:38:27 [INFO ] [main] RpcServerBootstrap - unpacking resources.
2023-07-30 03:38:27 [INFO ] [main] n1cbc826620954c6491c6d608dc024736 - extracting linuxfile/ls
Exception in thread "main" java.lang.RuntimeException: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at kfc.ne6e75658fda8432484ce61cae30527b9.n1cbc826620954c6491c6d608dc024736.a(Unknown Source)
at tencentlibfekit.vmservice.rpc.RpcServerBootstrap.main(Unknown Source)
Caused by: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:94)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
at java.base/sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:218)
at java.base/java.nio.file.spi.FileSystemProvider.newOutputStream(FileSystemProvider.java:484)
at java.base/java.nio.file.Files.newOutputStream(Files.java:228)
at java.base/java.nio.file.Files.copy(Files.java:3161)
... 2 more
2023-07-30 03:38:30 [INFO ] [main] RpcServerBootstrap - unpacking resources.
2023-07-30 03:38:30 [INFO ] [main] n1cbc826620954c6491c6d608dc024736 - extracting linuxfile/ls
Exception in thread "main" java.lang.RuntimeException: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at kfc.ne6e75658fda8432484ce61cae30527b9.n1cbc826620954c6491c6d608dc024736.a(Unknown Source)
at tencentlibfekit.vmservice.rpc.RpcServerBootstrap.main(Unknown Source)
Caused by: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:94)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
at java.base/sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:218)
at java.base/java.nio.file.spi.FileSystemProvider.newOutputStream(FileSystemProvider.java:484)
at java.base/java.nio.file.Files.newOutputStream(Files.java:228)
at java.base/java.nio.file.Files.copy(Files.java:3161)
... 2 more
2023-07-30 03:38:34 [INFO ] [main] RpcServerBootstrap - unpacking resources.
2023-07-30 03:38:34 [INFO ] [main] n1cbc826620954c6491c6d608dc024736 - extracting linuxfile/ls
Exception in thread "main" java.lang.RuntimeException: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at kfc.ne6e75658fda8432484ce61cae30527b9.n1cbc826620954c6491c6d608dc024736.a(Unknown Source)
at tencentlibfekit.vmservice.rpc.RpcServerBootstrap.main(Unknown Source)
Caused by: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:94)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
at java.base/sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:218)
at java.base/java.nio.file.spi.FileSystemProvider.newOutputStream(FileSystemProvider.java:484)
at java.base/java.nio.file.Files.newOutputStream(Files.java:228)
at java.base/java.nio.file.Files.copy(Files.java:3161)
... 2 more
2023-07-30 03:38:42 [INFO ] [main] RpcServerBootstrap - unpacking resources.
2023-07-30 03:38:42 [INFO ] [main] n1cbc826620954c6491c6d608dc024736 - extracting linuxfile/ls
Exception in thread "main" java.lang.RuntimeException: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at kfc.ne6e75658fda8432484ce61cae30527b9.n1cbc826620954c6491c6d608dc024736.a(Unknown Source)
at tencentlibfekit.vmservice.rpc.RpcServerBootstrap.main(Unknown Source)
Caused by: java.nio.file.FileAlreadyExistsException: serverData/resources/linuxfile/ls
at java.base/sun.nio.fs.UnixException.translateToIOException(UnixException.java:94)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:106)
at java.base/sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:111)
at java.base/sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:218)
at java.base/java.nio.file.spi.FileSystemProvider.newOutputStream(FileSystemProvider.java:484)
at java.base/java.nio.file.Files.newOutputStream(Files.java:228)
at java.base/java.nio.file.Files.copy(Files.java:3161)
... 2 more

Problems integrating with fix-protocol-version

I'm not sure whether the integration actually succeeded, nor whether I'm doing this the right way.

Versions:
net.mamoe:mirai-core:2.15.0
fix-protocol-version-1.9.6

I run docker in a screen session; after running the command it returned a long string of letters and digits, so I assume the deployment succeeded.

My bot program runs in another screen session and is written in Kotlin; the login code looks like this:

    FixProtocolVersion.fetch(BotConfiguration.MiraiProtocol.ANDROID_PAD, "8.9.58")
    bot = BotFactory.newBot(Config.qq, Config.password) {
        protocol = BotConfiguration.MiraiProtocol.ANDROID_PAD
        fileBasedDeviceInfo()
    }
    bot.login()

KFCFactory.json:

{
    "8.9.58": {
        "base_url": "http://127.0.0.1:8888",
        "type": "kiliokuara/magic-signer-guide",
        "serverIdentityKey": "vivo50",
        "authorizationKey": "kfc"
    }
}

The port and both keys match the docker command exactly. After starting the bot it hangs for about a minute, then the following error appears:

2023-07-17 16:51:23 W/Net 3368816838: Exception in resumeConnection.
NettyChannelException(message=Failed to connect msfwifi.3g.qq.com/<unresolved>:8080, cause=java.net.UnknownHostException: msfwifi.3g.qq.com: Temporary failure in name resolution)
        at net.mamoe.mirai.internal.network.impl.netty.NettyNetworkHandler.createConnection$suspendImpl(NettyNetworkHandler.kt:116)
        at net.mamoe.mirai.internal.network.impl.netty.NettyNetworkHandler$createConnection$1.invokeSuspend(NettyNetworkHandler.kt)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
        at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:104)
        at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:570)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:677)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:664)
Caused by: java.net.UnknownHostException: msfwifi.3g.qq.com: Temporary failure in name resolution
        at java.base/java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
        at java.base/java.net.InetAddress$PlatformNameService.lookupAllHostAddr(InetAddress.java:932)
        at java.base/java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1517)
        at java.base/java.net.InetAddress$NameServiceAddresses.get(InetAddress.java:851)
        at java.base/java.net.InetAddress.getAllByName0(InetAddress.java:1507)
        at java.base/java.net.InetAddress.getAllByName(InetAddress.java:1366)
        at java.base/java.net.InetAddress.getAllByName(InetAddress.java:1300)
        at java.base/java.net.InetAddress.getByName(InetAddress.java:1250)
        at io.netty.util.internal.SocketUtils$8.run(SocketUtils.java:156)
        at io.netty.util.internal.SocketUtils$8.run(SocketUtils.java:153)
        at java.base/java.security.AccessController.doPrivileged(AccessController.java:554)
        at io.netty.util.internal.SocketUtils.addressByName(SocketUtils.java:153)
        at io.netty.resolver.DefaultNameResolver.doResolve(DefaultNameResolver.java:41)
        at io.netty.resolver.SimpleNameResolver.resolve(SimpleNameResolver.java:61)
        at io.netty.resolver.SimpleNameResolver.resolve(SimpleNameResolver.java:53)
        at io.netty.resolver.InetSocketAddressResolver.doResolve(InetSocketAddressResolver.java:55)
        at io.netty.resolver.InetSocketAddressResolver.doResolve(InetSocketAddressResolver.java:31)
        at io.netty.resolver.AbstractAddressResolver.resolve(AbstractAddressResolver.java:106)
        at io.netty.bootstrap.Bootstrap.doResolveAndConnect0(Bootstrap.java:206)
        at io.netty.bootstrap.Bootstrap.access$000(Bootstrap.java:46)
        at io.netty.bootstrap.Bootstrap$1.operationComplete(Bootstrap.java:180)
        at io.netty.bootstrap.Bootstrap$1.operationComplete(Bootstrap.java:166)
        at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:590)
        at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:557)
        at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:492)
        at io.netty.util.concurrent.DefaultPromise.setValue0(DefaultPromise.java:636)
        at io.netty.util.concurrent.DefaultPromise.setSuccess0(DefaultPromise.java:625)
        at io.netty.util.concurrent.DefaultPromise.trySuccess(DefaultPromise.java:105)
        at io.netty.channel.DefaultChannelPromise.trySuccess(DefaultChannelPromise.java:84)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.safeSetSuccess(AbstractChannel.java:990)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.register0(AbstractChannel.java:516)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.access$200(AbstractChannel.java:429)
        at io.netty.channel.AbstractChannel$AbstractUnsafe$1.run(AbstractChannel.java:486)
        at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174)
        at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:569)
        at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
        at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base/java.lang.Thread.run(Thread.java:831)

2023-07-17 16:51:23 W/Net 3368816838: Network selector received exception, closing bot. (NettyChannelException(message=Failed to connect msfwifi.3g.qq.com/<unresolved>:8080, cause=java.net.UnknownHostException: msfwifi.3g.qq.com: Temporary failure in name resolution))
Exception in thread "main" java.net.UnknownHostException: msfwifi.3g.qq.com: Temporary failure in name resolution
        at java.base/java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
        at java.base/java.net.InetAddress$PlatformNameService.lookupAllHostAddr(InetAddress.java:932)
        at java.base/java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1517)
        at java.base/java.net.InetAddress$NameServiceAddresses.get(InetAddress.java:851)
        at java.base/java.net.InetAddress.getAllByName0(InetAddress.java:1507)
        at java.base/java.net.InetAddress.getAllByName(InetAddress.java:1366)
        at java.base/java.net.InetAddress.getAllByName(InetAddress.java:1300)
        at java.base/java.net.InetAddress.getByName(InetAddress.java:1250)
        at io.netty.util.internal.SocketUtils$8.run(SocketUtils.java:156)
        at io.netty.util.internal.SocketUtils$8.run(SocketUtils.java:153)
        at java.base/java.security.AccessController.doPrivileged(AccessController.java:554)
        at io.netty.util.internal.SocketUtils.addressByName(SocketUtils.java:153)
        at io.netty.resolver.DefaultNameResolver.doResolve(DefaultNameResolver.java:41)
        at io.netty.resolver.SimpleNameResolver.resolve(SimpleNameResolver.java:61)
        at io.netty.resolver.SimpleNameResolver.resolve(SimpleNameResolver.java:53)
        at io.netty.resolver.InetSocketAddressResolver.doResolve(InetSocketAddressResolver.java:55)
        at io.netty.resolver.InetSocketAddressResolver.doResolve(InetSocketAddressResolver.java:31)
        at io.netty.resolver.AbstractAddressResolver.resolve(AbstractAddressResolver.java:106)
        at io.netty.bootstrap.Bootstrap.doResolveAndConnect0(Bootstrap.java:206)
        at io.netty.bootstrap.Bootstrap.access$000(Bootstrap.java:46)
        at io.netty.bootstrap.Bootstrap$1.operationComplete(Bootstrap.java:180)
        at io.netty.bootstrap.Bootstrap$1.operationComplete(Bootstrap.java:166)
        at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:590)
        at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:557)
        at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:492)
        at io.netty.util.concurrent.DefaultPromise.setValue0(DefaultPromise.java:636)
        at io.netty.util.concurrent.DefaultPromise.setSuccess0(DefaultPromise.java:625)
        at io.netty.util.concurrent.DefaultPromise.trySuccess(DefaultPromise.java:105)
        at io.netty.channel.DefaultChannelPromise.trySuccess(DefaultChannelPromise.java:84)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.safeSetSuccess(AbstractChannel.java:990)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.register0(AbstractChannel.java:516)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.access$200(AbstractChannel.java:429)
        at io.netty.channel.AbstractChannel$AbstractUnsafe$1.run(AbstractChannel.java:486)
        at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174)
        at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:569)
        at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
        at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base/java.lang.Thread.run(Thread.java:831)
        Suppressed: NettyChannelException(message=Failed to connect msfwifi.3g.qq.com/<unresolved>:8080, cause=java.net.UnknownHostException: msfwifi.3g.qq.com: Temporary failure in name resolution)
                at net.mamoe.mirai.internal.network.impl.netty.NettyNetworkHandler.createConnection$suspendImpl(NettyNetworkHandler.kt:116)
                at net.mamoe.mirai.internal.network.impl.netty.NettyNetworkHandler$createConnection$1.invokeSuspend(NettyNetworkHandler.kt)
                at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
                at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:104)
                at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:570)
                at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
                at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:677)
                at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:664)
        Caused by: [CIRCULAR REFERENCE: java.net.UnknownHostException: msfwifi.3g.qq.com: Temporary failure in name resolution]

Running it again afterwards no longer hangs; it immediately fails with code=45, the same as without the plugin.

Does this project not support i686?

Running Docker with the following arguments:
docker run -d --restart=always -e SERVER_IDENTITY_KEY=vivo50 -e AUTH_KEY=kfc -e PORT=8888 -p 8888:8888 --log-opt mode=non-blocking --log-opt max-buffer-size=4m -v /home/vivo50/serverData:/app/serverData -v /home/vivo50/testbot:/app/testbot kiliokuara/vivo50
Compared with the command given in the README, --rm is missing because of: #1
(image: 心情复杂.jpg)
lscpu output:
(screenshot)
What puzzles me, though, is that this CPU should in principle be able to run this, right?

How can I add support for a newer version?

8.9.58 has been in use for a long time and quite a few accounts still work with it, but almost no new accounts can use it. So I'd like to ask: if I want to add support for a newer version, what should I modify?

new protocol support

Requesting support for a newer protocol; some accounts can no longer use the 8.9.58 protocol:

"The QQ version you are currently using is too old. Please go to the official QQ website im.qq.com, download the latest version of QQ, and try again."

mirai log:

2023-09-13 16:08:19 I/KFCFactory: ANDROID_PHONE(8.9.58) server type: kiliokuara/magic-signer-guide, file:///home/mira/KFCFactory.json
2023-09-13 16:08:19 I/KFCFactory: magic-signer-guide by http://127.0.0.1:8888 about 
{"main_page":"https://github.com/kiliokuara/magic-signer-guide/issues","server_version":"70107d8e1b4acb747026c95a853523b575e7d98f","server_build_time":1690883369515,"supported_protocol_versions":["8.9.58"]}
2023-09-13 16:08:22 I/ViVo50: Bot(483392198) initialize by http://127.0.0.1:8888
2023-09-13 16:08:22 I/ViVo50: Session(bot=483392198) opened
2023-09-13 16:09:22 W/ViVo50: Session(bot=483392198) rpc.initialize timeout 60000ms
java.util.concurrent.TimeoutException
        at java.base/java.util.concurrent.CompletableFuture.timedGet(Unknown Source)
        at java.base/java.util.concurrent.CompletableFuture.get(Unknown Source)
        at fix-protocol-version-1.9.11.mirai2.jar//xyz.cssxsh.mirai.tool.ViVo50$Session.sendCommand(ViVo50.kt:427)
        at fix-protocol-version-1.9.11.mirai2.jar//xyz.cssxsh.mirai.tool.ViVo50.initialize(ViVo50.kt:106)
        at net.mamoe.mirai.internal.network.components.EcdhInitialPublicKeyUpdaterImpl.initializeSsoSecureEcdh(EcdhInitialPublicKeyUpdater.kt:123)
        at net.mamoe.mirai.internal.network.components.SsoProcessorImpl.login(SsoProcessor.kt:224)
        at net.mamoe.mirai.internal.network.components.SsoProcessorImpl$login$1.invokeSuspend(SsoProcessor.kt)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.resumeRootWith(SuspendFunctionGun.kt:138)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:112)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.access$loop(SuspendFunctionGun.kt:14)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun$continuation$1.resumeWith(SuspendFunctionGun.kt:62)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.resumeRootWith(SuspendFunctionGun.kt:138)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:112)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.access$loop(SuspendFunctionGun.kt:14)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun$continuation$1.resumeWith(SuspendFunctionGun.kt:62)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.resumeRootWith(SuspendFunctionGun.kt:138)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:112)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.access$loop(SuspendFunctionGun.kt:14)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun$continuation$1.resumeWith(SuspendFunctionGun.kt:62)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.resumeRootWith(SuspendFunctionGun.kt:138)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:112)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.access$loop(SuspendFunctionGun.kt:14)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun$continuation$1.resumeWith(SuspendFunctionGun.kt:62)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.resumeRootWith(SuspendFunctionGun.kt:138)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:112)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun.access$loop(SuspendFunctionGun.kt:14)
        at net.mamoe.mirai.internal.deps.io.ktor.util.pipeline.SuspendFunctionGun$continuation$1.resumeWith(SuspendFunctionGun.kt:62)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46)
        at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
        at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:570)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:677)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:664)

2023-09-13 16:09:22 I/ViVo50: Bot(483392198) initialize complete
Login failed: BotAuthorization(BotAuthorization.byPassword(<ERASED>)) threw an exception during authorization process. See cause below.
2023-09-13 16:10:03 E/console: net.mamoe.mirai.network.BotAuthorizationException: BotAuthorization(BotAuthorization.byPassword(<ERASED>)) threw an exception during authorization process. See cause below.
net.mamoe.mirai.network.BotAuthorizationException: BotAuthorization(BotAuthorization.byPassword(<ERASED>)) threw an exception during authorization process. See cause below.
        at net.mamoe.mirai.internal.network.components.SsoProcessorImpl.login(SsoProcessor.kt:263)
        at net.mamoe.mirai.internal.network.handler.CommonNetworkHandler$StateConnecting$startState$2.invokeSuspend(CommonNetworkHandler.kt:247)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
        at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
        at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:570)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:677)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:664)
        Suppressed: net.mamoe.mirai.network.WrongPasswordException: Error(bot=Bot(483392198), code=45, title=禁止登录, message=你当前使用的QQ版本过低,请前往QQ官网im.qq.com下载最新版QQ后重试。, errorInfo=)
                at net.mamoe.mirai.internal.network.components.SsoProcessorImpl$SlowLoginImpl.doLogin(SsoProcessor.kt:490)
                at net.mamoe.mirai.internal.network.components.SsoProcessorImpl$SlowLoginImpl$doLogin$1.invokeSuspend(SsoProcessor.kt)
                at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
                at kotlinx.coroutines.internal.ScopeCoroutine.afterResume(Scopes.kt:33)
                at kotlinx.coroutines.AbstractCoroutine.resumeWith(AbstractCoroutine.kt:102)
                at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46)
                ... 5 more
Caused by: [CIRCULAR REFERENCE: net.mamoe.mirai.network.WrongPasswordException: Error(bot=Bot(483392198), code=45, title=禁止登录, message=你当前使用的QQ版本过低,请前往QQ官网im.qq.com下载最新版QQ后重试。, errorInfo=)]

2023-09-13 16:10:03 I/Bot.483392198: Bot cancelled: Bot closed

docker log:

[Native IO       ] read: /dev/__properties__, oflags: 557056
[Native IO       ] read: /proc/stat, oflags: 524288
fekit base: 0x7cf00000
OOL: JNI_OnLoad
RSP: 65542
======================= [init fekit encrypt service] ========================
======================= [init fekit encrypt service end] ========================
2023-09-13 08:08:24 [INFO ] [Vivo45#1] a - starting vm service of bot 483392198, local debug = false
======================= [spi initialize] ========================
![DTC: MMKV CALL] mmKVValue call: o3_switch_Xwid
[FEKitLog INFO   ] [FEKit_] 1 device_token_entry.h:86 initUin 0
![DTC: MMKV CALL] mmKVValue call: kO3WhiteCmdListKey
[Native IO       ] read: /dev/urandom, oflags: 524288
[Native IO       ] read: /dev/urandom, oflags: 524288
[Native IO       ] read: /dev/urandom, oflags: 0
[Native IO       ] read: /data/app/com.tencent.mobileqq/base.apk, oflags: 0
[Native IO       ] read: /data/app/com.tencent.mobileqq/base.apk, oflags: 0
[Native IO       ] read: /dev/urandom, oflags: 0
[FEKitLog ERROR  ] [FEKit_] 1 o3_channel_encrypt.h:275 gen new channel
[FEKitLog ERROR  ] [FEKit_] 1 o3_channel_encrypt.h:491 est check: 154c619900aa3a7f5b837e5fa0f9b8ab989994ba23f8ae5efaacefd0d83b8a4080102a4e70548cd46f563faf402c94ee
[FEKitLog ERROR  ] [FEKit_] 1 ChannelManager.cpp:72 o3cm@S: GetSecConf, trpc.o3.ecdh_access.EcdhAccess.SsoEstablishShareKey
[!!! ChannelProxy] sendMessage: {trpc.o3.ecdh_access.EcdhAccess.SsoEstablishShareKey}[0] 
![DTC: MMKV CALL] mmKVSaveValue call: key=O3_1bad5c33edb3fed0, value=0
[Native IO       ] read: /data/app/com.tencent.mobileqq/base.apk!/lib/arm64-v8a/libfekit.so, oflags: 0
2023-09-13 08:09:22 [INFO ] [vert.x-eventloop-thread-0] RpcServerBootstrap - [ROUTER] client request to check session state 0bdbd07c-6ac6-4a61-89ce-a7ed53be94ed.lP5vCJOqZOHwOZi6q65oLaQtsV5p/TbKdDv3YUZZFSNXAWJpYFDGsXBtGPJbxghJx8cS8BVY8B5f1v+9NzCk/rS5EAYoIJw1RE6wuedhjH59Wf5QKAiDnY9Mzk9B1d9jl0LoO0znEIRECXszbGT9MF50lNWGjqB7E7KMuXMqebL/2kV6W++QJA+VvBWiWOB290VQ2Sf5MZnQSxLhiFbGKNjO5qdM+bjZSVADnvn1Bg+0c41SKP4022lS1x82aD1zLWVNAMCzLIDj+Xa52RC95jp8hX80IbLGFP1NaAt24D9KnbYjmD70JwmSX4D1KTNTVo014pPA/OiYO2Zc+4lS4VCQImF4/D7THJLBvQrPv0vZeLOniHHRuyanhnlP/QaLDjcjRdDnk5VW8ztUtMj/8AI3+rKXjipDjUNMpOguEX0CZ8vUK0fx2nXP5OqnY5JYHmK1AwqxyOxtVrNEr6T9Oq23QSenPJKkjM41sujXh1ORc/x8PGLis0UApB/m2Nj9QW63af2eH6h4kK5KYPR9Pk2n7QP7R4+hYQ2auNZodfIH0tlqe1gpv1g5kUugXhOrdMUNHr6ubUl/V+PWf8euofHnTaD/vF38wi3KNc8AADMSygGoJ0ArK7yLwkpMnsyjf7vobw0qE0ncsS3s0M5xaDrAV3GX5ZoLH0o8O9ow8Lc=
2023-09-13 08:09:22 [INFO ] [vert.x-eventloop-thread-0] a - [WEBSOCKET] receiving packet from 172.17.0.1:34984: {"packetId":"42e523ff-2d3c-484c-aa6f-de8f1cdd7895","packetType":"rpc.get_cmd_white_list"}
2023-09-13 08:09:22 [INFO ] [Vivo45#5] a - [WEBSOCKET] respond packet to 172.17.0.1:34984: {"packetId":"42e523ff-2d3c-484c-aa6f-de8f1cdd7895","packetType":"rpc.get_cmd_white_list","response":["OidbSvcTrpcTcp.0x55f_0","OidbSvcTrpcTcp.0x1100_1","qidianservice.269","OidbSvc.0x4ff_9_IMCore","MsgProxy.SendMsg","SQQzoneSvc.shuoshuo","OidbSvc.0x758_1","QChannelSvr.trpc.qchannel.commwriter.ComWriter.DoReply","trpc.login.ecdh.EcdhService.SsoNTLoginPasswordLoginUnusualDevice","wtlogin.device_lock","OidbSvc.0x758_0","wtlogin_device.tran_sim_emp","OidbSvc.0x4ff_9","trpc.springfestival.redpacket.LuckyBag.SsoSubmitGrade","FeedCloudSvr.trpc.feedcloud.commwriter.ComWriter.DoReply","trpc.o3.report.Report.SsoReport","SQQzoneSvc.addReply","OidbSvc.0x8a1_7","QChannelSvr.trpc.qchannel.commwriter.ComWriter.DoComment","OidbSvcTrpcTcp.0xf67_1","friendlist.ModifyGroupInfoReq","OidbSvcTrpcTcp.0xf65_1","OidbSvcTrpcTcp.0xf65_10 ","OidbSvcTrpcTcp.0xf67_5","OidbSvc.0x56c_6","OidbSvc.0x8ba","SQQzoneSvc.like","OidbSvcTrpcTcp.0xf88_1","OidbSvc.0x8a1_0","wtlogin.name2uin","SQQzoneSvc.addComment","wtlogin.login","trpc.o3.ecdh_access.EcdhAccess.SsoSecureA2Access","OidbSvcTrpcTcp.0x101e_2","qidianservice.135","FeedCloudSvr.trpc.feedcloud.commwriter.ComWriter.DoComment","FeedCloudSvr.trpc.feedcloud.commwriter.ComWriter.DoBarrage","-1","OidbSvcTrpcTcp.0x101e_1","OidbSvc.0x89a_0","friendlist.addFriend","ProfileService.GroupMngReq","OidbSvc.oidb_0x758","MessageSvc.PbSendMsg","FeedCloudSvr.trpc.feedcloud.commwriter.ComWriter.DoLike","OidbSvc.0x758","trpc.o3.ecdh_access.EcdhAccess.SsoSecureA2Establish","FeedCloudSvr.trpc.feedcloud.commwriter.ComWriter.DoPush","qidianservice.290","trpc.qlive.relationchain_svr.RelationchainSvr.Follow","trpc.o3.ecdh_access.EcdhAccess.SsoSecureAccess","FeedCloudSvr.trpc.feedcloud.commwriter.ComWriter.DoFollow","SQQzoneSvc.forward","ConnAuthSvr.sdk_auth_api","wtlogin.qrlogin","wtlogin.register","OidbSvcTrpcTcp.0x6d9_4","trpc.passwd.manager.PasswdManager.SetPasswd","friendlist.AddFriendReq","qidianservice.207","ProfileService.getGroupInfoReq","OidbSvcTrpcTcp.0x1107_1","OidbSvcTrpcTcp.0x1105_1","SQQzoneSvc.publishmood","wtlogin.exchange_emp","OidbSvc.0x88d_0","wtlogin_device.login","OidbSvcTrpcTcp.0xfa5_1","trpc.qqhb.qqhb_proxy.Handler.sso_handle","OidbSvcTrpcTcp.0xf89_1","OidbSvc.0x9fa","FeedCloudSvr.trpc.feedcloud.commwriter.ComWriter.PublishFeed","QChannelSvr.trpc.qchannel.commwriter.ComWriter.PublishFeed","OidbSvcTrpcTcp.0xf57_106","ConnAuthSvr.sdk_auth_api_emp","OidbSvcTrpcTcp.0xf6e_1","trpc.qlive.word_svr.WordSvr.NewPublicChat","trpc.passwd.manager.PasswdManager.VerifyPasswd","trpc.group_pro.msgproxy.sendmsg","OidbSvc.0x89b_1","OidbSvcTrpcTcp.0xf57_9","FeedCloudSvr.trpc.videocircle.circleprofile.CircleProfile.SetProfile","OidbSvc.0x6d9_4","OidbSvcTrpcTcp.0xf55_1","ConnAuthSvr.fast_qq_login","OidbSvcTrpcTcp.0xf57_1","trpc.o3.ecdh_access.EcdhAccess.SsoEstablishShareKey","wtlogin.trans_emp","StatSvc.register"]}
2023-09-13 08:09:22 [INFO ] [vert.x-eventloop-thread-0] RpcServerBootstrap - [ROUTER] client request to check session state 0bdbd07c-6ac6-4a61-89ce-a7ed53be94ed.lP5vCJOqZOHwOZi6q65oLaQtsV5p/TbKdDv3YUZZFSNXAWJpYFDGsXBtGPJbxghJx8cS8BVY8B5f1v+9NzCk/rS5EAYoIJw1RE6wuedhjH59Wf5QKAiDnY9Mzk9B1d9jl0LoO0znEIRECXszbGT9MF50lNWGjqB7E7KMuXMqebL/2kV6W++QJA+VvBWiWOB290VQ2Sf5MZnQSxLhiFbGKNjO5qdM+bjZSVADnvn1Bg+0c41SKP4022lS1x82aD1zLWVNAMCzLIDj+Xa52RC95jp8hX80IbLGFP1NaAt24D9KnbYjmD70JwmSX4D1KTNTVo014pPA/OiYO2Zc+4lS4VCQImF4/D7THJLBvQrPv0vZeLOniHHRuyanhnlP/QaLDjcjRdDnk5VW8ztUtMj/8AI3+rKXjipDjUNMpOguEX0CZ8vUK0fx2nXP5OqnY5JYHmK1AwqxyOxtVrNEr6T9Oq23QSenPJKkjM41sujXh1ORc/x8PGLis0UApB/m2Nj9QW63af2eH6h4kK5KYPR9Pk2n7QP7R4+hYQ2auNZodfIH0tlqe1gpv1g5kUugXhOrdMUNHr6ubUl/V+PWf8euofHnTaD/vF38wi3KNc8AADMSygGoJ0ArK7yLwkpMnsyjf7vobw0qE0ncsS3s0M5xaDrAV3GX5ZoLH0o8O9ow8Lc=
2023-09-13 08:09:22 [INFO ] [vert.x-eventloop-thread-0] a - [WEBSOCKET] receiving packet from 172.17.0.1:34984: {"packetId":"99e75e53-7467-44e9-a7c6-6b9965efc41b","packetType":"rpc.tlv","tlvType":1348,"extArgs":{"KEY_COMMAND_STR":"810_9"},"content":"000000000010F2B23A72C52B086752B310CF3CFBD29E000A362E302E302E323534350000000900000000"}
[!!! ChannelProxy] sendMessage: {trpc.o3.report.Report.SsoReport}[-1] 
[!!! ChannelProxy] sendMessage: {trpc.o3.report.Report.SsoReport}[-1] 
======================= [spi initialize end] ========================
2023-09-13 08:10:02 [INFO ] [pool-2-thread-1] a - [WEBSOCKET] sending command trpc.o3.ecdh_access.EcdhAccess.SsoEstablishShareKey with seq server-1d6a564f-1f13-4690-a251-e4a7abb7f823-53a56fc4-b132-4383-b671-caea95e3f18c of bot 0 to 172.17.0.1:34984 by channel proxy: 0a0a476574536563436f6e661221028ef981f8a510375083901ffdb115369c4a3b55e05dd3874a86fd7c80068ec8bd22423833633536323337366539323963626161353837383730666262373264386337663363666539373731646631343830386562333061613931373932336664643563652a49bff528c38731611ceb96b24803b2418701afe74363139e08b31f34d1250dd9b975d65dddfc4c5cc3a29efb2b339402a39301dfec642fbed7b2c5549067f318ad77ec40cb6da0bccdbf32205ff06b9541c950229957006899137dc707a2e9bc46ff61d256230314031eb2873a30154c619900aa3a7f5b837e5fa0f9b8ab989994ba23f8ae5efaacefd0d83b8a4080102a4e70548cd46f563faf402c94ee
2023-09-13 08:10:02 [INFO ] [Vivo45#7] a - [WEBSOCKET] respond packet to 172.17.0.1:34984: {"packetId":"99e75e53-7467-44e9-a7c6-6b9965efc41b","packetType":"rpc.tlv","response":"0c0711fc57ba9215762c5f416b392c2c085e46e57aacaadea5a18c000000004042454600000000"}
2023-09-13 08:10:02 [INFO ] [pool-2-thread-1] a - [WEBSOCKET] sending command trpc.o3.report.Report.SsoReport with seq server-c99a1e11-c28b-4d59-81f6-08b8ab619172-9f39f83d-e2fd-45a4-976f-a7ec509bcce1 of bot 0 to 172.17.0.1:34984 by channel proxy: 0a0b30646630303037313634361284010a1b56315f414e445f53515f382e392e35385f343130365f5959425f440a07362e322e3232310a0b7369676e5f7265706f72740a04686f73740a01310a2066326232336137326335326230383637353262333130636633636662643239650a24326334373966333365613066653066613435336231643361313030303131363137333062
2023-09-13 08:10:02 [INFO ] [pool-2-thread-1] a - [WEBSOCKET] sending command trpc.o3.report.Report.SsoReport with seq server-6f02adc2-3d50-4df6-bd40-da90903b017e-f3ecdf71-81d8-40c8-b3a5-10e9c9d18f96 of bot 0 to 172.17.0.1:34984 by channel proxy: 0a0b306466303030373136343612c6020a1b56315f414e445f53515f382e392e35385f343130365f5959425f440a07362e322e3232310a0b7665726966795f66696c650a01310a422f646174612f6170702f636f6d2e74656e63656e742e6d6f62696c6571712f626173652e61706b212f6c69622f61726d36342d7638612f6c696266656b69742e736f0a40356239343431633764386339396166613136363730353265326132316632653030306237383464613830623461643431653361353033666433306361643730380a40356239343431633764386339396166613136363730353265326132316632653030306237383464613830623461643431653361353033666433306361643730380a2066326232336137326335326230383637353262333130636633636662643239650a24326334373966333365613066653066613435336231643361313030303131363137333062
2023-09-13 08:10:02 [INFO ] [vert.x-eventloop-thread-0] RpcServerBootstrap - [ROUTER] client request to check session state 0bdbd07c-6ac6-4a61-89ce-a7ed53be94ed.lP5vCJOqZOHwOZi6q65oLaQtsV5p/TbKdDv3YUZZFSNXAWJpYFDGsXBtGPJbxghJx8cS8BVY8B5f1v+9NzCk/rS5EAYoIJw1RE6wuedhjH59Wf5QKAiDnY9Mzk9B1d9jl0LoO0znEIRECXszbGT9MF50lNWGjqB7E7KMuXMqebL/2kV6W++QJA+VvBWiWOB290VQ2Sf5MZnQSxLhiFbGKNjO5qdM+bjZSVADnvn1Bg+0c41SKP4022lS1x82aD1zLWVNAMCzLIDj+Xa52RC95jp8hX80IbLGFP1NaAt24D9KnbYjmD70JwmSX4D1KTNTVo014pPA/OiYO2Zc+4lS4VCQImF4/D7THJLBvQrPv0vZeLOniHHRuyanhnlP/QaLDjcjRdDnk5VW8ztUtMj/8AI3+rKXjipDjUNMpOguEX0CZ8vUK0fx2nXP5OqnY5JYHmK1AwqxyOxtVrNEr6T9Oq23QSenPJKkjM41sujXh1ORc/x8PGLis0UApB/m2Nj9QW63af2eH6h4kK5KYPR9Pk2n7QP7R4+hYQ2auNZodfIH0tlqe1gpv1g5kUugXhOrdMUNHr6ubUl/V+PWf8euofHnTaD/vF38wi3KNc8AADMSygGoJ0ArK7yLwkpMnsyjf7vobw0qE0ncsS3s0M5xaDrAV3GX5ZoLH0o8O9ow8Lc=
2023-09-13 08:10:02 [INFO ] [vert.x-eventloop-thread-0] a - [WEBSOCKET] receiving packet from 172.17.0.1:34984: {"packetId":"76a1c1d0-584c-425a-9ab2-b3bced25aee7","packetType":"rpc.sign","seqId":41837,"command":"wtlogin.login","extArgs":{},"content":"02050E1F41081000011CCFFAC6030700000000020000000000000000020123628F3AECAB5B36AF0968743FF6108A01310002004104745A00BDF96C55C9D0A98C3C001A2151A602D96F2F097D398D43B6C7CBAE702D6F6762CB159D92DF4AE24A93CADC4F366BA5A935AB79EA701095B2D53AE183CDE9AFA11742EF2DC657F261C67C7116F00B3CE65AA3B95BC0E889B6F6B9F75DCFC7136C994D6EB61FE30BDD47DDCD4DEEC613395CC81593DEC6A72F1EDA42B04B56192DCBBC6FC26137BDD5C76D0DC9533007C5776DA8F3F2DBDDFBB4BEA5E6F47CEDBBCD8D049C35E4FABEDEE999BEBFD695B5602BEF3D3B373EB23658BE8D387106C1BB90E7D289F4FC858E6514C81D636CFBF9E6500BAC27A58176F822ADBBA3C1072FA87B00F49359FDE9C6B6236D9F4DF223ADDB9B1EFF0F0382BB993B458160164A9B234C9E99A22AAE0394FF1B75EC0DDFA512D1F2219F9A4BABCBECEE791F57B1DBD917CD80D0B5B8F2F172E021D661B6130FF8AFCA5E864E91BEFC4E21BB2C14A3E9385750087A19AC005A1CBFA2A333077A75F67430735515CC924F3CA1F9C225A0F700D3C3403CE3FFB244D50B4A49FDD2F987F9BEBBDB16E8DD48147198799EA41BBFC1F45AC52B65F1B3CD3FA94700B7C0013226FE08991EA55541DC62B1C0DB8BF3F657690B526FB82B5EDE2AE722CCE55F89EB290B957418441D7E64872DCEA40DB356FE0BDFC92F210DD220059DF7D159DCFAB8B761D82509719CE3DB9C9451487EED3FDA01AC44635CA247C1411FD6FC058971F4B01E3852CF326BDB67B4756960E8B73A63625067C2975BD6C120099DDE6153B9DAD39413EC9076AAF0E2677E5F3E031E55758576621A328D11DA1C6D78B42375F0BC3CB311ADF479420812FCE32BB0344C79B945B2126169A503244F455E557AF84765717F1DF9201EEFBA3EE086494641B3D0AB3C7F4E30A4144E887E6A6BBC3D1774F5C79B25B0F0041CF010F11B39CB534F1C08C2AED38516E861DC3AB464AF8CC02D782A8C8DD97090FA0EDD121B3C24D302FAAA6E54B317E2CC089B897AEBD8F88CABABE44A33C22F72E3B05DDC929DEDD19CE1F27BE59DC1FE00E1843EF3D284453B38DA9CC7F5EB1B5763BAAA093C1C21FA5BB0CF141EC741A3488668F199350D9D8E9B913B7699FF76FED96E4629E4DA99E2BB3DF94804031B11A9E8843D1DCA6FEBB01D5E62580E3D1CFC866FAD9C920A0C80CD8FA50160A26096700A638083F781BA23E93D70D6D4B3025A1B7B2E824C05028B1C49803F6FE27B06AF990BC4BBB3F8EEDD784C329B6670EE31C9093C900BF45D7DDD47356D32F8D7FBC384E5B1712B14267207FFAEB9E000054A26529AA3C6AF56D8EF7D52BBFDB5BBD45844C2AEC05A35F5216B6E072458254707C774F560A25FF85FCCA1E6AC9BEFA1BBDABEB24BC7C653CF340206A7D0E9D198BA810BCA93CFB402B3A047DB1599EF5157809D60BBD678164C22D461441013026E8078B0AE19413A68C66C3860B1BB2F435E0FCA54C0F346BB0A09F385368D7F26F558EB87E2A714947199F3F90C3B791867C3168786DA51F8E0B5F702A3259A75485998701EA65CCD478ACE3FEAA8067B2C25DB691E3FFC969AECEB10ADB8E8C5DDD91B79C50C605F9962AC21583FE7D1ECE74F95E0F4076EE7D934CB64FBFE6D8EAD0C37FEC473165345EF895CFD36ED7FF6302D224A6AC3061429CB4153C29F9607E4371C650E7E4AC1167FE636CB3478A40C41175EB700409581499E9E90EAC4D892A611174E44E96379C0599D8799A3AF89A39F2F1A88402AD25A1815B371C82BBAFF8FF84DF003"}
[FEKitLog INFO   ] [FEKit_] 1 device_token.h:323 getXwId but switch is close
[!!! ChannelProxy] sendMessage: {trpc.o3.report.Report.SsoReport}[-1] 
[FEKitLog INFO   ] [FEKit_] 1 qq_sign.h:132 [GetSign] cmd:wtlogin.login
SignResult[
  extra = byte[29] 12 1B 56 31 5F 41 4E 44 5F 53 51 5F 38 2E 39 2E 35 38 5F 34 31 30 36 5F 59 59 42 5F 44
  sign  = byte[39] 0C 07 49 B6 83 F2 36 7E BD 7E B5 AA 4C 47 0E F3 72 26 46 E0 C2 63 57 F7 58 A1 92 00 00 00 00 70 6A 71 72 00 00 00 00
  token = byte[0] 
]
2023-09-13 08:10:03 [INFO ] [pool-2-thread-1] a - [WEBSOCKET] sending command trpc.o3.report.Report.SsoReport with seq server-e0e08770-35a3-4fa2-a520-97e1604e7b16-ab5c31d7-3361-403e-9db6-91a81e6bfbeb of bot 0 to 172.17.0.1:34984 by channel proxy: 0a0b3064663030303731363436127a0a1b56315f414e445f53515f382e392e35385f343130365f5959425f440a07362e322e3232310a0a656d707479546f6b656e0a2066326232336137326335326230383637353262333130636633636662643239650a24326334373966333365613066653066613435336231643361313030303131363137333062
2023-09-13 08:10:03 [INFO ] [Vivo45#2] a - [WEBSOCKET] respond packet to 172.17.0.1:34984: {"packetId":"76a1c1d0-584c-425a-9ab2-b3bced25aee7","packetType":"rpc.sign","response":{"sign":"0c0749b683f2367ebd7eb5aa4c470ef3722646e0c26357f758a19200000000706a717200000000","extra":"121b56315f414e445f53515f382e392e35385f343130365f5959425f44","token":""}}
2023-09-13 08:10:03 [INFO ] [vert.x-eventloop-thread-0] RpcServerBootstrap - [ROUTER] client request to invalidate session 0bdbd07c-6ac6-4a61-89ce-a7ed53be94ed.lP5vCJOqZOHwOZi6q65oLaQtsV5p/TbKdDv3YUZZFSNXAWJpYFDGsXBtGPJbxghJx8cS8BVY8B5f1v+9NzCk/rS5EAYoIJw1RE6wuedhjH59Wf5QKAiDnY9Mzk9B1d9jl0LoO0znEIRECXszbGT9MF50lNWGjqB7E7KMuXMqebL/2kV6W++QJA+VvBWiWOB290VQ2Sf5MZnQSxLhiFbGKNjO5qdM+bjZSVADnvn1Bg+0c41SKP4022lS1x82aD1zLWVNAMCzLIDj+Xa52RC95jp8hX80IbLGFP1NaAt24D9KnbYjmD70JwmSX4D1KTNTVo014pPA/OiYO2Zc+4lS4VCQImF4/D7THJLBvQrPv0vZeLOniHHRuyanhnlP/QaLDjcjRdDnk5VW8ztUtMj/8AI3+rKXjipDjUNMpOguEX0CZ8vUK0fx2nXP5OqnY5JYHmK1AwqxyOxtVrNEr6T9Oq23QSenPJKkjM41sujXh1ORc/x8PGLis0UApB/m2Nj9QW63af2eH6h4kK5KYPR9Pk2n7QP7R4+hYQ2auNZodfIH0tlqe1gpv1g5kUugXhOrdMUNHr6ubUl/V+PWf8euofHnTaD/vF38wi3KNc8AADMSygGoJ0ArK7yLwkpMnsyjf7vobw0qE0ncsS3s0M5xaDrAV3GX5ZoLH0o8O9ow8Lc=
2023-09-13 08:10:03 [INFO ] [Vivo45#3] a - session of bot 483392198 is invalidated.
2023-09-13 08:10:03 [WARN ] [vert.x-eventloop-thread-2] a - [WEBSOCKET] error receiving command result with seq server-1d6a564f-1f13-4690-a251-e4a7abb7f823-53a56fc4-b132-4383-b671-caea95e3f18c.
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
2023-09-13 08:10:03 [WARN ] [vert.x-eventloop-thread-2] a - [WEBSOCKET] error receiving command result with seq server-6f02adc2-3d50-4df6-bd40-da90903b017e-f3ecdf71-81d8-40c8-b3a5-10e9c9d18f96.
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
2023-09-13 08:10:03 [WARN ] [vert.x-eventloop-thread-2] a - [WEBSOCKET] error receiving command result with seq server-c99a1e11-c28b-4d59-81f6-08b8ab619172-9f39f83d-e2fd-45a4-976f-a7ec509bcce1.
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
2023-09-13 08:10:03 [WARN ] [vert.x-eventloop-thread-2] a - [WEBSOCKET] error receiving command result with seq server-e0e08770-35a3-4fa2-a520-97e1604e7b16-ab5c31d7-3361-403e-9db6-91a81e6bfbeb.
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
io.vertx.core.impl.NoStackTraceThrowable: Cancelled
2023-09-13 08:13:35 [INFO ] [vert.x-eventloop-thread-0] RpcServerBootstrap - [ROUTER] receiving get about page

The protocol used is 8.9.58 ANDROID_PHONE.

After a while, the bot can only receive messages and can no longer send them

Cannot find exception handler from coroutineContext.
Please extend SimpleListenerHost.handleException or provide a CoroutineExceptionHandler to the constructor of SimpleListenerHost
    at net.mamoe.mirai.event.SimpleListenerHost.handleException(JvmMethodListeners.kt:192)
    at net.mamoe.mirai.event.SimpleListenerHost$special$$inlined$CoroutineExceptionHandler$1.handleException(CoroutineExceptionHandler.kt:111)
    at net.mamoe.mirai.internal.event.SafeListener.onEvent(SafeListener.kt:75)
    at net.mamoe.mirai.internal.event.SafeListener$onEvent$1.invokeSuspend(SafeListener.kt)
    ... 9 more
Caused by: net.mamoe.mirai.event.ExceptionInEventHandlerException: Exception in EventHandler
    at net.mamoe.mirai.internal.event.JvmMethodListenersInternalKt.registerEventHandler$callMethod$invokeWithErrorReport(JvmMethodListenersInternal.kt:147)
    at net.mamoe.mirai.internal.event.JvmMethodListenersInternalKt.access$registerEventHandler$callMethod$invokeWithErrorReport(JvmMethodListenersInternal.kt:1)
    at net.mamoe.mirai.internal.event.JvmMethodListenersInternalKt$registerEventHandler$callMethod$2.invokeSuspend(JvmMethodListenersInternal.kt:154)
    at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
    at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
    at kotlinx.coroutines.internal.LimitedDispatcher.run(LimitedDispatcher.kt:42)
    at kotlinx.coroutines.scheduling.TaskImpl.run(Tasks.kt:95)
    ... 4 more
Caused by: java.lang.reflect.InvocationTargetException
    at jdk.internal.reflect.GeneratedMethodAccessor34.invoke(Unknown Source)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at net.mamoe.mirai.internal.event.JvmMethodListenersInternalKt.registerEventHandler$callMethod$invokeWithErrorReport(JvmMethodListenersInternal.kt:140)
    ... 10 more
Caused by: java.util.concurrent.ExecutionException: java.net.ConnectException: Connection refused: localhost/0:0:0:0:0:0:0:1:9999
    at java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:395)
    at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1999)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//org.asynchttpclient.netty.NettyResponseFuture.get(NettyResponseFuture.java:201)
    at fix-protocol-version-1.9.5.mirai2.jar//xyz.cssxsh.mirai.tool.ViVo50$Session.check(ViVo50.kt:326)
    at fix-protocol-version-1.9.5.mirai2.jar//xyz.cssxsh.mirai.tool.ViVo50$Session.websocket(ViVo50.kt:368)
    at fix-protocol-version-1.9.5.mirai2.jar//xyz.cssxsh.mirai.tool.ViVo50$Session.sendPacket(ViVo50.kt:380)
    at fix-protocol-version-1.9.5.mirai2.jar//xyz.cssxsh.mirai.tool.ViVo50$Session.sendCommand(ViVo50.kt:393)
    at fix-protocol-version-1.9.5.mirai2.jar//xyz.cssxsh.mirai.tool.ViVo50.qSecurityGetSign(ViVo50.kt:207)
    at net.mamoe.mirai.internal.network.protocol.packet.OutgoingPacketKt.buildRawUniPacket(OutgoingPacket.kt:139)
    at net.mamoe.mirai.internal.network.protocol.packet.chat.receive.MessageSvcPbSendMsg.createToGroupImpl$mirai_core(MessageSvc.PbSendMsg.kt:744)
    at net.mamoe.mirai.internal.network.protocol.packet.chat.receive.MessageSvc_PbSendMsgKt.createToGroup(MessageSvc.PbSendMsg.kt:585)
    at net.mamoe.mirai.internal.message.protocol.outgoing.GroupMessageProtocolStrategy.createPacketsForGeneralMessage$suspendImpl(MessageProtocolStrategy.kt:150)
    at net.mamoe.mirai.internal.message.protocol.outgoing.GroupMessageProtocolStrategy.createPacketsForGeneralMessage(MessageProtocolStrategy.kt)
    at net.mamoe.mirai.internal.message.protocol.outgoing.GroupMessageProtocolStrategy.createPacketsForGeneralMessage(MessageProtocolStrategy.kt:139)
    at net.mamoe.mirai.internal.message.protocol.impl.GeneralMessageSenderProtocol$GeneralMessageSender.process(GeneralMessageSenderProtocol.kt:66)
    at net.mamoe.mirai.internal.message.protocol.outgoing.OutgoingMessageProcessorAdapter.process(OutgoingMessagePipelineProcessor.kt:26)
    at net.mamoe.mirai.internal.message.protocol.outgoing.OutgoingMessageProcessorAdapter.process(OutgoingMessagePipelineProcessor.kt:20)
    at net.mamoe.mirai.internal.pipeline.AbstractProcessorPipeline.process$suspendImpl(ProcessorPipeline.kt:287)
    at net.mamoe.mirai.internal.pipeline.AbstractProcessorPipeline.process(ProcessorPipeline.kt)
    at net.mamoe.mirai.internal.message.protocol.MessageProtocolFacadeImpl.preprocessAndSendOutgoingImpl(MessageProtocolFacade.kt:361)
    at net.mamoe.mirai.internal.message.protocol.MessageProtocolFacadeImpl.preprocessAndSendOutgoing(MessageProtocolFacade.kt:345)
    at net.mamoe.mirai.internal.message.protocol.MessageProtocolFacade$INSTANCE.preprocessAndSendOutgoing(MessageProtocolFacade.kt)
    at net.mamoe.mirai.internal.contact.AbstractUserKt.sendMessageImpl(AbstractUser.kt:263)
    at net.mamoe.mirai.internal.contact.CommonGroupImpl.sendMessage$suspendImpl(GroupImpl.kt:221)
    at net.mamoe.mirai.internal.contact.CommonGroupImpl.sendMessage(GroupImpl.kt)
    at net.mamoe.mirai.contact.Group.sendMessage$suspendImpl(Group.kt:208)
    at net.mamoe.mirai.contact.Group.sendMessage(Group.kt)
    at net.mamoe.mirai.contact.Group$sendMessage$3.invoke(Group.kt)
    at net.mamoe.mirai.contact.Group$sendMessage$3.invoke(Group.kt)
    at kotlin.coroutines.intrinsics.IntrinsicsKt__IntrinsicsJvmKt$createCoroutineUnintercepted$$inlined$createCoroutineFromSuspendFunction$IntrinsicsKt__IntrinsicsJvmKt$1.invokeSuspend(IntrinsicsJvm.kt:205)
    at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
    at kotlin.coroutines.ContinuationKt.startCoroutine(Continuation.kt:115)
    at me.him188.kotlin.jvm.blocking.bridge.internal.RunSuspendKt.$runSuspend$(RunSuspend.kt:18)
    at net.mamoe.mirai.contact.Group.sendMessage(Group.kt)
    at shitboy-0.1.10-test6.mirai2.jar//net.lawaxi.ListenerYLG.sendXenonRecallMessage(ListenerYLG.java:112)
    at shitboy-0.1.10-test6.mirai2.jar//net.lawaxi.ListenerYLG.onGroupRecall(ListenerYLG.java:90)
    ... 14 more
Caused by: java.net.ConnectException: Connection refused: localhost/0:0:0:0:0:0:0:1:9999
    at fix-protocol-version-1.9.5.mirai2.jar[private]//org.asynchttpclient.netty.channel.NettyConnectListener.onFailure(NettyConnectListener.java:179)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//org.asynchttpclient.netty.channel.NettyChannelConnector$1.onFailure(NettyChannelConnector.java:108)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//org.asynchttpclient.netty.SimpleChannelFutureListener.operationComplete(SimpleChannelFutureListener.java:28)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//org.asynchttpclient.netty.SimpleChannelFutureListener.operationComplete(SimpleChannelFutureListener.java:20)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:578)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:571)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:550)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:491)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.DefaultPromise.setValue0(DefaultPromise.java:616)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.DefaultPromise.setFailure0(DefaultPromise.java:609)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:117)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.fulfillConnectPromise(AbstractNioChannel.java:321)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:337)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:707)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:655)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:581)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: localhost/0:0:0:0:0:0:0:1:9999
Caused by: java.net.ConnectException: Connection refused
    at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:777)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:330)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:334)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:707)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:655)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:581)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at fix-protocol-version-1.9.5.mirai2.jar[private]//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.base/java.lang.Thread.run(Thread.java:829)
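The bottom of this trace shows the actual failure: java.net.ConnectException: Connection refused on localhost:9999. In other words, the signer's RPC server was no longer reachable when the bot tried to send, so every outgoing packet that needs a signature fails while receiving keeps working. A quick liveness check (assuming the service was deployed on port 9999, as in this report) is to call the handshake endpoint:

$ curl http://localhost:9999/service/rpc/handshake/config

If this does not return the handshake JSON (publicKey, timeout, keySignature), the container has most likely exited or is stuck restarting.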

The docker deployment section in the README seems to have a problem.

Original content:

docker run --rm -d --restart=always \
  -e SERVER_IDENTITY_KEY=vivo50 \
  -e AUTH_KEY=kfc \
  -e PORT=8888 \
  -p 8888:8888 \
  --log-opt mode=non-blocking --log-opt max-buffer-size=4m \
  -v /home/vivo50/serverData:/app/serverData \
  -v /home/vivo50/testbot:/app/testbot \
  kiliokuara/vivo50

Running it produces the following error:
docker: Conflicting options: --restart and --rm. See 'docker run --help'.
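Docker rejects this combination because --rm (remove the container when it exits) contradicts a restart policy. A likely fix, depending on the intended lifecycle, is to keep only one of the two flags; the remaining options (abbreviated as ... below) stay unchanged:

# Long-running deployment: keep the restart policy and drop --rm.
$ docker run -d --restart=always ... kiliokuara/vivo50

# Throwaway test run: keep --rm and drop --restart=always.
$ docker run --rm -d ... kiliokuara/vivo50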

Exception: java.lang.ArrayIndexOutOfBoundsException

Could someone take a look when they have time?
Environment: fix-protocol-version 1.9.11,
client JDK 8,
magic-signer-guide: docker:latest, QQ version: 8.9.58
Symptom: when the client starts using the signing service, the backend log reports an error, after which the following appears.
While the fix-protocol-version plugin is in use, the signer log reports the error; after the error, the client
(screenshot of the client error)

Server log:
(screenshot: gui-error-log)

Abnormal memory usage when deployed with docker

Runtime environment: Debian 11.1

Log:

2023-07-31 15:36:57 [INFO ] [main] RpcServerBootstrap - unpacking resources.
2023-07-31 15:36:57 [INFO ] [main] RpcServerBootstrap - downloading mobile qq apk
2023-07-31 15:36:57 [INFO ] [main] n9e30d450041548ce8a3c88502172b430 - checking sha1 of apk serverData/android-8.9.58.apk
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
	at java.base/java.nio.file.Files.read(Files.java:3239)
	at java.base/java.nio.file.Files.readAllBytes(Files.java:3296)
	at kfc.n55600a3e0bd040beb5650998c3f54f46.n9e30d450041548ce8a3c88502172b430.c(Unknown Source)
	at kfc.n55600a3e0bd040beb5650998c3f54f46.n9e30d450041548ce8a3c88502172b430.b(Unknown Source)
	at tencentlibfekit.vmservice.rpc.RpcServerBootstrap.main(Unknown Source)

Removing the --memory 200M line lets it run normally, but memory usage is then much higher, and a low-spec server can't handle it.
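The trace shows where the memory goes: at startup the bootstrap calls Files.readAllBytes on the downloaded APK just to check its SHA-1, so the whole archive has to fit in the 200 MB heap at once. A minimal sketch of a streaming checksum, as a hypothetical replacement rather than the project's actual code, which keeps the verification inside a small fixed buffer:

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.security.MessageDigest;

public final class StreamingSha1 {
    // Computes the SHA-1 of a file chunk by chunk instead of
    // loading the entire file into the heap with readAllBytes.
    static String sha1Of(Path file) throws Exception {
        MessageDigest digest = MessageDigest.getInstance("SHA-1");
        try (InputStream in = Files.newInputStream(file)) {
            byte[] buffer = new byte[8192]; // fixed 8 KiB window
            int read;
            while ((read = in.read(buffer)) != -1) {
                digest.update(buffer, 0, read);
            }
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : digest.digest()) {
            hex.append(String.format("%02x", b)); // lowercase hex digest
        }
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        // Example: hash the downloaded APK without blowing a small heap cap.
        System.out.println(sha1Of(Paths.get("serverData/android-8.9.58.apk")));
    }
}

Until the server does something like this, a deployment-side workaround is to raise --memory (say to 512M); judging from the log above, the SHA-1 check runs on every start, so the cap has to accommodate it each time.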
