
galtransl's Introduction

GalTransl

支持GPT3.5/4/Newbing/Sakura等大语言模型的Galgame自动化翻译解决方案

English

GalTransl是一套将数个基础功能上的微小创新与对GPT提示工程(Prompt Engineering)的深度利用相结合的Galgame自动化翻译工具,用于制作内嵌式翻译补丁。

前言

    GalTransl的核心是一组自动化翻译脚本,解决了使用ChatGPT自动化翻译Gal过程中已知的大部分问题,并提高了整体的翻译质量。同时,通过与其他项目的组合,打通了制作补丁的完整流程,一定程度上降低了上手门槛。对此感兴趣的朋友可以通过本项目更容易地构建具有一定质量的机翻补丁,并(或许)可以尝试在此框架的基础上高效地构建更高质量的汉化补丁。

  • 特性:
  1. 支持GPT3.5、Newbing、GPT-4、Sakura等大语言模型,并通过提示工程提高了GPT的翻译质量
  2. 首创GPT字典,让GPT了解人设,准确翻译人名、人称代词与生词
  3. 通过译前、译后字典与条件字典实现灵活的自动化字典系统
  4. 实时保存缓存、自动断点续翻
  5. 结合其他项目支持多引擎脚本一键解包与注入,提供完整教程降低上手难度
  6. (新)现在也支持直接翻译srt、lrc、vtt字幕文件,mtool json文件,t++ excel文件,epub文件
  7. (新)🤗 Galtransl-7B-v1是为视觉小说翻译任务专项优化的本地模型,可在6G VRAM以上显卡部署,由sakuraumi和xd2333共同构建。

❗❗使用本工具翻译并在未做全文校对/润色的前提下发布时,请在最显眼的位置标注"GPT翻译/AI翻译补丁",而不是"个人汉化"或"AI汉化"补丁。

近期更新

  • 2024.5:更新v5,新增GalTransl-7B-V1模型,新增多种文件类型支持
  • 2024.2:更新v4版,主要支持了插件系统
  • 2023.12:更新v3版,支持基于文件的多线程 by @ryank231231
  • 2023.7:更新v2版,主要重构了代码 by @ryank231231
  • 2023.6:v1初版发布


环境准备

  • 免环境版
    现在release里有winexe版本,不需要安装运行环境和依赖。

  • 下载本项目
    解压到任意位置,例如 D:\GalTransl

  • Python
    安装 Python 3.11/3.12。 下载
    安装时勾选下方的 Add Python to PATH

  • 安装Python依赖
    安装 Python 后

如果你是初学者,可以直接双击"安装、更新依赖.bat"来安装本项目需要的依赖。

如果你熟悉python,本项目提供Poetry进行依赖管理,可以通过以下命令安装并运行程序:

pip config set global.index-url https://pypi.tuna.tsinghua.edu.cn/simple
pip install poetry
cd 本仓库目录(例如 D:\GalTransl)
poetry install
poetry shell
python -m GalTransl -p 你的项目路径 -t 翻译后端
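
例如,假设本仓库解压在 D:\GalTransl、项目文件夹是其中自带的 sampleProject,并使用 GPT3.5 官方 API(后端名这里假设写作 gpt35,请以程序实际提示为准),那么最后一步的完整命令大致是:

python -m GalTransl -p D:\GalTransl\sampleProject -t gpt35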

实用工具

名称 说明
EmEditor 文本工具:神一样的文本编辑器。下载
GalTransl_DumpInjector 脚本工具:VNTextPatch的图形化界面,综合脚本文本提取导入工具
SExtractor 脚本工具:综合脚本文本提取导入工具
DBTXT2Json_jp 脚本工具:通用双行文本与json_jp互转脚本
GARbro 引擎工具:神一样的解包工具。下载
KirikiriTools 引擎工具:Krkr、krkrz 提取、注入工具
UniversalInjectorFramework 引擎工具:sjis隧道、sjis替换模式通用注入框架
VNTextProxy 引擎工具:sjis隧道模式通用注入框架

上手教程

做一个gal内嵌翻译补丁的大致流程是:

  1. 识别引擎 -> 解包资源包拿到脚本 -> 接2.
  2. 解包脚本为日文文本 -> 翻译为中文文本 -> 构建中文脚本 -> 接3.
  3. 封包为资源包/免封包 -> 接4.
  4. 引擎支持unicode的话,直接玩 -> 引擎是shift jis的,尝试2种路线使其支持显示中文

我会分成以上4个模块分步讲解,这个段落为了让没做过的朋友也能有机会上手,会写得更照顾小白一些。

  • 建议先只跑开头一个文件的翻译,或先随便添加一些中文,导回游戏确认可以正常显示再全部翻译


第一章 识别与解包

识别引擎其实很简单,通常来说,使用GARbro打开游戏目录内的任意资源包,在左下方的状态栏中就会显示引擎名称。

或者,参考资源包后缀表,比较资源包的后缀。

剧情脚本一般在一些带有明显关键字的资源包内,或在资源包中带明显关键字的目录内,例如:scene、scenario、message、script等字样。并且剧情脚本通常由许多文件组成,明显按章节、人物划分,有的还把剧情和hs分开(例如带_h后缀),通常多翻找几个资源包就能找到。

或者,参考Dir-A佬的教程

特别地,针对新的krkrz引擎,GARbro已经无法打开资源包,可以用KrkrzExtract项目,将游戏拖到exe上启动。然后下载一个全CG存档,按住Ctrl把所有剧情快进一遍,也可以获取到脚本文件。

第二章 提取与翻译

  • 【2.1. 提取脚本文本】
        通常情况下,本项目是结合VNTextPatch工具来解包脚本的。VNTextPatch是由外国大佬arcusmaximus开发的通用工具,支持许多引擎脚本的提取与注入。(但并不是这些引擎都能搞定,实测有的游戏会提取失败)

    VNTextPatch是使用cmd操作的,为了降低上手难度,我搓了一个图形化的界面,你可以在项目的useful_tools/GalTransl_DumpInjector内找到,点击GalTransl_DumpInjector.exe运行。
    现在,你只需要选择日文脚本目录,然后选择保存提取的日文json的目录,这里一般将日文脚本放到叫script_jp的文件夹,再新建一个gt_input目录,用于存储提取出的脚本。
    需要注意GalTransl全程是使用name-message格式的JSON输入、处理和输出的。(不了解的话可以先搜索"JSON是什么")
提取出来的json文件可以用EmEditor打开,一般是这个样子的:

[
  {
    "name": "咲來",
    "message": "「ってか、白鷺学園だったらあたしと一緒じゃん。\r\nセンパイだったんですねー」"
  }
]

    其中,每个{object(对象)}是一句话,message是消息内容,如果object还带了name,说明是对话。不过可能并不是所有类型的脚本都可以带name提取,当可以正确提取name时,GalTransl的翻译质量会更好
    PS. GalTransl只支持指定格式的json文件输入,但并不是说GalTransl就与VNTextPatch工具绑定了,也可以使用SExtractor工具,现在也支持导出GalTransl需要的name-message格式JSON

  • 【2.2. GalTransl启动】
        将本项目下载下来解压到任意位置(示例中默认为D盘根目录),在项目示例文件夹sampleProject中,找到示例配置文件config.inc.yaml,将其重命名为config.yaml。另外,也将sampleProject文件夹改个名字,一般是游戏的名字。
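
整理完成后,项目文件夹的结构大致如下(仅作示意,目录名以你的实际项目为准,transl_cache 等目录会在开始翻译后自动生成):

D:\GalTransl\你的游戏名\
├─ config.yaml          ← 由 config.inc.yaml 重命名而来
├─ gt_input\            ← 放入提取出的日文 name-message 格式 json
├─ transl_cache\        ← 翻译开始后自动生成的缓存目录
├─ 项目GPT字典.txt       ← 可选,人名等项目专用 GPT 字典
├─ 项目字典_译前.txt     ← 可选
└─ 项目字典_译后.txt     ← 可选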

    本教程使用GPT3.5官方API来举例。其他引擎可参考下面引擎使用章节,对应修改示例项目的config.yaml即可调用。
    先将所有提取出的日文json文件放入示例文件夹内的gt_input文件夹中,然后用任意文本编辑器编辑config.yaml文件,按注释修改以下内容:

# 代理设置
proxy:
  enableProxy: true # 是否启用代理(true/false)
  proxies:
    - address: socks5://127.0.0.1:10818 # 代理地址,也可以改成http://……
backendSpecific:
  GPT35: # GPT3.5 官方 API
    tokens: # 令牌列表
      - token: sk-xxxxxxxx # 你的令牌
        endpoint: https://api.openai.com  # 这个令牌对应的OPENAI API请求的端点,使用转发或第三方API时需要修改
      - token: sk-yyyyyyyy # 可以填多个令牌,如果你只有一个的话,把示例文件的这两行删掉
        endpoint: "" # 可以填多个令牌,如果你只有一个的话,把示例文件的这两行删掉
    defaultEndpoint: https://api.openai.com # 默认 API 端点,一般不修改

    在这里需要一个openai的api key,以及需要魔法上网来走代理访问openai官方api端点。
    如果没有api key或魔法上网的话,你还可以使用一些第三方api中转项目,例如:

  • GPT-API-free,免费API中转,提供有请求频率限制的用于测试。
  • GPT水龙头,免费API中转,每24小时可领取一个 $1.00 令牌用于开发测试 AI 产品
  • 一些收费api转发项目,例如:MAX-API、happy api等等,以上只是举例,更多中转可以谷歌,本项目不担保它们的稳定性及可用性。

    但要注意这里获取的key是第三方的key,不能用于官方API端点。如果你使用类似项目的话,做以下额外的修改:

  enableProxy: false # 此时不用设置代理

  GPT35: 
    tokens: 
      - token: sk-xxxxxxxx # 你的第三方令牌
        endpoint: https://api.chatanywhere.com.cn  # 使用第三方API端点

    修改好项目设置后,确保你已经安装了需要的依赖(见环境准备),然后双击run.bat(免环境版双击exe),首先拖入项目文件夹,例如D:\GalTransl-main\sampleProject

接着选择gpt35:

(图:选择gpt35)

程序就会启动并开始翻译:

(图:程序开始翻译)

    但是,不建议就这样开始翻译了,请至少要先学会GPT字典的使用,为你要翻译的gal设定好各角色的人名字典,这样才能保证基本的翻译质量。

    翻译完成后,如果想手工修正,可以对缓存进行修正,并重新生成结果json,见翻译缓存章节

  • 【2.3. 构建中文脚本】
        如果你是使用GalTransl提取注入工具提取的脚本,构建同理,选择日文脚本目录、中文json目录、中文脚本保存目录,然后点'注入',即可将文本注入回脚本。但这里面有一些坑,第四章会提到。

注:

  1. 这里一般把中文脚本保存目录叫script_cn,因为日文脚本目录叫script_jp
  2. 一般使用什么工具导出,就用什么工具导入。所以要先尝试导入导出是否都正常再开始翻译。

第三章 封包或免封

    构建好中文脚本后,下一步就是想办法让游戏读取。首先目前主流引擎基本都是支持免封包读取的,可以继续参考Dir-A佬的教程,看看你要搞的引擎支不支持免封包读取。
    特别的,针对krkr/krkrz引擎,可以使用arcusmaximus大佬的KirikiriTools工具,下载里面的version.dll,丢到游戏目录里,然后在游戏目录里新建一个"unencrypted"文件夹,将脚本直接丢进去(不用新建二级目录),就可以让krkr读取
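
以krkr免封包为例,放好之后的游戏目录大致长这样(game.exe、data.xp3 和脚本文件名都只是示意,以实际游戏为准):

游戏目录\
├─ game.exe
├─ data.xp3
├─ version.dll          ← 来自 KirikiriTools
└─ unencrypted\
   ├─ scene01.ks        ← 构建好的中文脚本直接平铺放入,不用建二级目录
   └─ scene02.ks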

第四章 引擎与编码

    在这一章首先需要了解一下unicode、sjis(shift jis)、gbk编码的基础知识,为了偷懒在这里我还是放Dir-A佬的文章,如果你对这块不了解的话,先去读一下。

如果你在做的引擎支持unicode编码,例如krkr、Artemis引擎等,一般就可以直接玩了。但如果引擎是使用sjis编码的话,直接打开会是乱码,这时候需要通过2种路线尝试使其可以正常显示中文:

路线1:使用GBK编码注入脚本,然后修改引擎程序使其支持GBK编码
路线2:仍然使用jis编码注入脚本,但通过jis隧道或jis替换(推荐)2种方式,结合通用注入dll在运行过程中通过动态替换来显示中文

GalTransl提取注入工具的VNTextPatch模式注入脚本时默认是以sjis或unicode(utf8)编码注入的,这取决于引擎类型。

  • 使用路线1
    (注:这个模式现在有bug,有的引擎会卡死)在注入前勾选"GBK编码注入",在这个模式下所有GBK编码不支持的字符将被替换成空白,例如音符♪
    然后需要ollydbg或windbg工具,在这里下载,用于修改引擎。
    最后还是去看Dir-A佬的教程,里面有教如何下断点、修改,完全没接触过逆向的话这可能很难,但没办法,照着视频多试试。

  • 使用路线2
    在注入脚本时先什么都不勾选,如果有提示"sjis_ext.bin包含文字:xxx"的话,说明程序是以sjis编码注入的,并把这些不支持显示的字符放到script_cn目录内的sjis_ext.bin里供sjis隧道模式调用了。

jis隧道:仍然来自arcusmaximus大佬的VNTranslationTools项目中的VNTextProxy组件。VNTextPatch在将文本注入回脚本时,会将sjis编码不支持的字符临时替换为sjis编码中未定义的字符,VNTextProxy通过DLL劫持技术HOOK游戏,并在遇到这些字符时再把它还原回去。

当使用sjis隧道模式时,将script_cn内的sjis_ext.bin文件移动到游戏目录内,然后将useful_tools\VNTextProxy内的所有dll逐个丢到游戏目录内(一般推荐先试version.dll,或使用PEID/DIE等工具查输入表),运行游戏,看有没有哪个dll可以正确的hook游戏并让不显示的文本可以正常显示(不正常的话那些地方会是空的)。不正常的话,删掉这个DLL,换下一个。详细设置见此

jis替换:来自AtomCrafty大佬的UniversalInjectorFramework(通用注入框架)项目,也是通过DLL劫持技术HOOK游戏,并可以将某个字符根据设置替换成指定的另一个字符,不限编码。我建立了一套替换字典,按一些规则梳理了jis编码内不支持的简中汉字与jis支持的日文汉字的映射关系,可以满足99.99%常用简体中文汉字的正常显示(见hanzi2kanji_table.txt),并将替换功能写在了GalTransl提取注入工具内(现在SExtractor也支持替换)。在替换后结合UniversalInjectorFramework的动态Hook替换功能在游戏中将这些日文汉字替换回简中文字,实现游戏的正常显示。

当使用sjis替换模式时,可以先运行一遍GalTransl提取注入工具的注入文本,获取游戏不支持的文字列表(注入后会提示"sjis_ext.bin包含文字:xxx"),然后,勾选"sjis替换模式注入",把这些文字复制到右边的文本框内,再点击注入。注入后会获得一个sjis替换模式配置。

打开useful_tools/UniversalInjectorFramework文件夹,里面也是很多dll,同样逐个尝试,一般推荐先试winmm.dll。把目录内的uif_config.json一并复制到游戏目录,然后编辑这个json,按GalTransl提取注入工具提供的配置填写source_characters和target_characters。然后运行游戏,如果游戏可以正常运行,并且弹出了一个控制台窗口,那多半就搞定了。如果不正常的话,删掉这个DLL,尝试换下一个。 注:UniversalInjectorFramework也支持sjis隧道模式,可以将tunnel_decoder设置为True,然后在mapping里填入sjis_ext.bin包含的文字。详细配置文件设置见此
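
下面是一个 uif_config.json 的示意片段,只保留与sjis替换相关的两个字段;外层字段名(这里假设叫 character_substitution)和完整结构请以 GalTransl 提取注入工具给出的配置及 UniversalInjectorFramework 的文档为准:

{
  "character_substitution": {
    "source_characters": "(工具提示的、sjis不支持的简中字符)",
    "target_characters": "(与之一一对应的日文汉字)"
  }
}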

GalTransl核心功能介绍

介绍GPT字典、缓存、普通字典、找问题等功能。

GPT字典

    GPT字典系统是使用GalTransl翻译时想提高质量的关键功能,通过补充设定的方式大幅提高翻译质量,是GPT翻译区别于传统机翻的核心。适用于gpt35、gpt4、newbing。
在程序目录中,Dict文件夹内有"通用GPT字典.txt",在项目文件夹内可以新建"项目GPT字典.txt",一般人名定义写进项目字典,通用提高翻译质量的词汇写进通用字典。

  • 举例来说,你可以提前在这里对每个角色名的中文翻译进行定义,并说明这个角色的设定,例如性别、大致年龄、职业等。通过自动给GPT喂这些设定,可以自动调整合适的人称代词他/她、称谓等,并固定人名为假名时的中文翻译。
  • 再比如,可以在这里为GPT补充一些它总是翻不对的词语,如果提供一定的解释,它会理解的更好。

  • 通过下面这个例子认识GPT字典喂人物设定的用法,每行的格式为日文[Tab]中文[Tab]解释(可不写),注意中间的连接符为TAB
フラン	芙兰	name, lady, teacher
笠間	笠间	笠間 陽菜乃’s lastname, girl
陽菜乃	阳菜乃	笠間 陽菜乃's firstname, girl
张三	张三	player's name, boy
$str20	$str20	player's codename, boy

这几条字典都是定义角色用的:

  • 第一条可以理解为我想告诉GPT:“假名フラン的翻译是芙兰,这是人名,是位女士,是老师”。这样GPT在翻译フラン先生的时候就会翻译成芙兰老师而不会是芙兰医生。
  • 二三条是同一个人的日本姓和名,经测试姓名必须拆成两行写,不然GPT3.5会不认识。
  • 第四条是设定主角的推荐写法。注意即使日文和中文相同,也要再重复一遍
  • 第五条是主角在脚本中使用占位符而不是名字时的推荐写法。
  • 设定不要太复杂,否则会让GPT多很多奇怪脑补。

  • 通过下面这个例子认识GPT字典喂生词的用法,每行的格式亦为日文[Tab]中文[Tab]解释(可不写),注意中间的连接符为TAB
大家さん  房东
あたし	我/人家	use '人家' when being cute
  • 当你发现GPT不太认识这个词,例如“大家さん”,并且这个词含义比较唯一,那么就可以像这样加进通用GPT字典里,解释不是必要的。
  • 第二行的中文写了一个多义词“我/人家”,并且在解释中写了“当扮可爱时用人家”。GPT3.5没那么聪明,但GPT4基本可以灵活运用。
  • 想让GPT更瑟?自己加字典(

只有当本次发送给GPT的人名和句子中出现了某个词条时,该词条的解释才会被送进本轮的对话中。
但不要什么词都往里加,什么都往里加只会害了你,推荐只写各角色的设定和总是会翻错的词。

运行时字典会动态地展示在每一次请求里:

(图:请求中动态展示的字典内容)

常规字典

在GalTransl中,常规字典是分为"译前字典"与"译后字典"的。译前字典是在翻译前对日文的a to b替换处理,译后字典是对译后中文的a to b替换处理。

译前字典多用于一些口齿不清的矫正情况;另外当多个词代表同一个意思时,也可以用译前字典先统一,减少GPT字典的词条数量。

译后字典就是比较常见的字典,在译后将某个词替换成另一个词,但此处我额外设计了一个叫"条件字典"的机制。条件字典实际上就是在替换前增加了一步判断,用于避免误替换、过度替换等情况。
每行格式为pre_jp/post_jp[tab]判断词[tab]查找词[tab]替换词,具体示例见下方列表之后。

  • pre_jp/post_jp代表判断词查找的位置,其定义在"翻译缓存"章节讲解
  • 判断词:如果在查找位置(pre_jp/post_jp)中找到判断词,才会激活后面的替换。
  • 判断词可以在开头加"!"代表"不存在则替换",否则一般是代表"存在则替换"。
  • 判断词可以使用[or][and]关键字连接,多个[or]连接代表"有一个条件满足就进入替换",多个[and]连接代表"条件都满足才进入替换"。
  • 查找词、替换词,同普通字典,将a替换成b。
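
下面是几条假设的条件字典词条,仅作演示、并非项目自带字典,注意连接符都是TAB:

post_jp	あたし	我	人家
post_jp	先生[and]病院	先生	医生
post_jp	!先輩	学长	前辈

第一条表示:当处理后日文中出现"あたし"时,把译文里的"我"替换成"人家";第二条要求"先生"和"病院"同时出现才把"先生"替换成"医生";第三条表示当日文中不存在"先輩"时,把译文里的"学长"改回"前辈"。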

翻译缓存

开始翻译后,可以在transl_cache目录内找到翻译缓存。

翻译缓存与json_jp是一一对应的,在翻译过程中,翻译结果会优先写进缓存里,当一个文件被翻译完成后,才会出现在json_cn里。

首先,总结一些要点:

  1. 当你想重翻某句时,打开对应的翻译缓存文件,删掉该句的pre_zh整行(不要留空行)
  2. 当你想整段重翻时,直接删对应的数个object块,重翻某文件时,直接删对应的翻译缓存文件。
  3. 当GalTransl正在翻译时,不要修改正在翻译的文件的缓存,改了也会被覆写回去。
  4. json_cn结果文件 = 翻译缓存内的pre_zh/proofread_zh + 译后字典替换 + 恢复对话框
  5. 当新的post_jp与缓存内的post_jp不一致时,会触发重翻,一般发生在添加了新的译前字典时

下面是翻译缓存的典型样例:

    {
        "index": 4,
        "name": "",
        "pre_jp": "欠品していたコーヒー豆を受け取ったまでは良かったが、\r\n帰り道を歩いていると汗が吹き出してくる。",
        "post_jp": "欠品していたコーヒー豆を受け取ったまでは良かったが、\r\n帰り道を歩いていると汗が吹き出してくる。",
        "pre_zh": "领取了缺货的咖啡豆还好,\r\n但是走在回去的路上就汗流浃背了。",
        "proofread_zh": "领了缺货的咖啡豆倒是没问题,\r\n可是走在回去的路上,汗水就冒了出来。",
        "trans_by": "NewBing",
        "proofread_by": "NewBing",
        "trans_conf": 0.94,
        "doub_content": [
            "汗流浃背"
        ]
    },

解释一下每个字段的含义:

  • 基本参数:
    index 序号
    name 人名
    pre_jp 原始日文
    post_jp 处理后日文。一般来讲,post_jp = pre_jp 去除对话框 + 译前字典替换。你会代码的话也可以在此处加入自己的处理
    pre_zh 原始中文
    proofread_zh 校对的中文
    没有post_zh,post_zh在json_cn里。

  • 扩展参数:
    trans_by 翻译引擎/翻译者
    proofread_by 校对引擎/校对者
    trans_conf 翻译置信度,仅NewBing、GPT4支持,第4句0.94代表NewBing对该句的准确性有94%的把握
    doub_content 存疑片段,仅NewBing、GPT4支持,代表翻译引擎觉得翻译可能不准确的地方
    unknown_proper_noun 未知专有名词,仅NewBing、GPT4支持,方便后期人工修正
    problem 存储问题。见下方自动化找错。
    post_zh_preview 用于预览json_cn,但对它的修改并不会应用到json_cn,要改请修改pre_zh/proofread_zh

  • 简单讲下如何用Emeditor修缓存:选中一个文件,先右键-Emeditor打开,然后把transl_cache内所有文件全选拖进去。
    这时候标签可能会占很大位置,右键标签-自定义标签页,将"标签不合适时"改成"无",这样标签就只会在一行了(需要使用Emeditor专业版)。
    接着ctrl+f搜索,搜索你感兴趣的关键字(如problem、doub_content),勾选"搜索组群中所有文档",即可快速在所有文件中搜索,或点提取快速预览所有的问题。

  • 在确定需要修改的内容后,直接修改对应句子的pre_zh,或proofread_zh,然后重新跑一遍Galtransl,很快就会生成新的json_cn

自动化找错

GalTransl根据长期对翻译结果的观察建立了一套根据规则自动找问题的系统。

找问题系统的开启是在各个项目的`config.yaml`里,默认配置是这样的
# 问题分析机制配置
problemAnalyze:
  GPT35: # GPT35 / ChatGPT
    - 词频过高
    - 标点错漏
    - 残留日文
    - 丢失换行
    - 多加换行
    - 比日文长
  arinashiDict:
    # 格式为 aaa:<空格>bbb
    aaa: bbb
    ccc: ddd

目前支持找以下问题,将问题名按示例放到对应的翻译引擎里来激活,删掉则禁用:

  • 词频过高:某个字或符号重复超过20次会触发,用于寻找可能的乱翻情况。
  • 标点错漏:寻找括号、引号、冒号等符号的多加或丢失。
  • 残留日文:翻译后仍有日文残留的情况。
  • 丢失换行:翻译后把原有换行符(\r\n)丢了
  • 多加换行:过度脑补,自己加了换行的情况。
  • 比日文长:通常来说中文的信息量是比日文大的。所以如果某一句翻译后明显比日文长的话,说明这句的翻译可能窜行了(上一句或下一句的翻译窜到了本句)。在problem中会以"比日文长x倍"的形式记录。
  • 字典使用:用于检查GPT是否正确使用了GPT字典。

arinashi_dict是一个可以自定义规则的找问题字典,配置格式为

    # 格式为 aaa:<空格>bbb
    aaa: bbb
    ccc: ddd

设置后,程序会去寻找"日文中有aaa,但译文中没有bbb"与"日文中没有aaa,但译文中有bbb"两种情况,一般用于检查某些专有名词有没有被正确地翻译,示例见下。
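
例如,想检查"アリス"是否总是被译成"爱丽丝"、"クオン"是否总是被译成"久远",可以这样配置(词条是假设的示例):

  arinashiDict:
    アリス: 爱丽丝
    クオン: 久远

之后只要某句的日文里有アリス而译文里没有爱丽丝(或者反过来),就会在该句缓存的problem字段里记录一条问题。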

找到问题后会存在翻译缓存里,见翻译缓存章节,使用Emeditor批量提取problem关键字就可以看到目前所有的问题了,并通过修改缓存的pre_zh/proofread_zh来修正问题。

(新) 现在还可以通过在config.yaml中配置retranslKey来批量重翻某个问题,例如 retranslKey: "残留日文"

配置文件与翻译引擎设置

本篇介绍各个翻译引擎API的调用配置。
  • 基础配置 直接读配置文件注释就好了。
---
# 通用(杂项)设置
common:
  loggingLevel: info # 日志等级(未实现) [debug/info/warning/error]
  workersPerProject: 1 # 同时翻译的文件数量,为 1 时等于单线程
  # 通用设置
  sourceLanguage: ja # 源语言。[zh-cn/zh-tw/en/ja/ko/ru/fr]
  targetLanguage: zh-cn # 目标语言。[zh-cn/zh-tw/en/ja/ko/ru/fr]
  skipRetry: false # 开启则解析结果出错时跳过循环重试,直接用"Fail Translation"占位。[True/False]
  retranslFail: false # 重启时重翻所有"Fail Translation"的句子。[True/False]
  retranslKey: "" # 重启时主动重翻在Problem或pre_jp中包含此关键字的句子,例如"残留日文"
  gpt.numPerRequestTranslate: 9 # 单次翻译句子数量,不建议太大
  gpt.streamOutputMode: true # 流式输出/打字机效果,开启方便观察过程,关闭方便观察结果(多线程下无效)[True/False]
  # NewBing/GPT4
  gpt.enableProofRead: false # (NewBing/GPT4)是否开启译后校润。[True/False]
  gpt.numPerRequestProofRead: 7 # (NewBing/GPT4)单次校润句子数量,不建议修改
  gpt.recordConfidence: false # (NewBing/GPT4)记录确信度、存疑句,GPT4官方API关掉可节约token。[True/False]
  gpt.forceNewBingHs: false # (NewBing)强制NewBing翻译hs,导致速度变得很慢且可能更容易被ban号。(考虑废弃)[True/False]
  # GPT3.5/GPT4
  gpt.translStyle: auto # (GPT3.5/4 官方API)GPT参数预设,precise更精确,normal更随机,auto自动切换。[auto/precise/normal]
  gpt.degradeBackend: false # (GPT3.5/4 官方API)是否将 GPT4 的key用于 GPT3.5 的请求。[True/False]
  gpt.restoreContextMode: true # (GPT3.5/4 官方API)重启时恢复上一轮的译文前文。[True/False]
  gpt.fullContextMode: false # (GPT3.5/4 官方API)保留更多前文。开启提升效果,关闭节约数倍token消耗。[True/False]
  gpt.lineBreaksImprovementMode: false # (GPT3.5)换行符改善模式,部分减少丢换行符情况,但可能导致循环重试。(考虑废弃)[True/False]
# 代理设置
proxy:
  enableProxy: false # 是否启用代理。[True/False]
  proxies:
    - address: http://127.0.0.1:7890
      # username: foo
      # password: bar
  • 字典配置 直接读配置文件注释就好了。
# 字典
dictionary:
  defaultDictFolder: Dict # 通用字典文件夹,相对于程序目录,也可填入绝对路径
  usePreDictInName: false # 将译前字典用在name字段,可用于改名[True/False]
  usePostDictInName: false # 将译后字典用在name字段,可用于汉化name[True/False]
  # 预处理字典,按字典顺序替换
  preDict:
    - 01H字典_矫正_译前.txt # 用于口齿不清的矫正
    - 00通用字典_译前.txt
    - (project_dir)项目字典_译前.txt # (project_dir)代表字典在项目文件夹
  # GPT 字典
  gpt.dict:
    - GPT字典.txt
    - (project_dir)项目GPT字典.txt
  # 后处理字典,按字典顺序替换
  postDict:
    - 00通用字典_符号_译后.txt # 符号矫正
    - 00通用字典_译后.txt
    - (project_dir)项目字典_译后.txt
  • NewBing
    需要微软账号,使用Edge浏览器,还要梯子。然后下载EditThisCookie扩展
    访问https://www.bing.com/ ,登录后点EditThisCookie图标,点"导出Cookies",
    然后在示例项目的文件夹里新建一个newbing_cookies文件夹,然后在里面新建一个txt,名称随意,把点击导出Cookies得到的内容粘贴进去保存即可。

在配置文件中修改以下配置:

  bingGPT4:
    cookiePath:
      - newbing_cookies/cookie1.txt # 你的 cookies 文件1
      - newbing_cookies/cookie2.json # 你的 cookies 文件2,后缀不影响程序读取

cookiePath下可以将多个文件按例子往下写,当一个账号到达上限后,会切到下一个账号。

开启校润模式:
配置 gpt.enableProofRead: true,翻译完一个json后会开始对这个json自动化再润色

针对newbing拒绝翻译的情况,一个推荐的技巧是先设置gpt.numPerRequestTranslate为9或12,翻译一遍后,设置retranslFail为True,设置gpt.numPerRequestTranslate为3,再跑一遍,剩下的就是newbing死活都不会翻译的了,换引擎吧。 另外,如果脚本有将hs分开,可以单独为hs建一个项目文件夹翻,翻完合并json_jp和transl_cache。
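
上面提到的"第二遍重翻"思路,落到config.yaml上大致是下面两项(示意,其余配置保持不变):

common:
  retranslFail: true # 重启时重翻所有"Fail Translation"的句子
  gpt.numPerRequestTranslate: 3 # 第二遍把单次翻译句数降到3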

  • GPT-3.5
    官方API调用方式见上手教程

(2023.12 模拟网页操作目前不可用)
使用模拟网页操作模式时,登录网页版账号后访问https://chat.openai.com/api/auth/session

拷贝其中"accessToken":后面双引号内的一大串内容,复制到配置里,然后调用时选择Chatgpt-gpt35引擎即可调用

  ChatGPT: # ChatGPT / GPT3.5(4) 非官方 API,模拟网页操作
    access_tokens:
      - access_token: xxx # 此处粘贴accessToken
  • GPT-4
    官方API调用方式见上手教程,将api key填入配置文件的 GPT4(GPT4 官方 API)一节中即可

(2023.12 模拟网页操作目前不可用)
使用模拟网页操作模式时,登录网页版账号后访问https://chat.openai.com/api/auth/session

拷贝其中"accessToken":后面双引号内的一大串内容,复制到配置里,然后调用时选择Chatgpt-gpt4引擎即可调用

  ChatGPT: # ChatGPT / GPT3.5(4) 非官方 API,模拟网页操作
    access_tokens:
      - access_token: xxx # 此处粘贴accessToken

开启校润模式:
配置 gpt.enableProofRead: true,翻译完一个json后会开始对这个json自动化再润色

  • Sakura
    按教程部署llama.cpp一键包(地址)

然后修改配置文件来设置后端地址:

  Sakura:
    endpoint: http://127.0.0.1:8080 # 修改为server的地址

galtransl's People

Contributors

xd2333, ryank231231, pidanshourouzhouxd, noriverwater, isotr0py, gulaodeng, pipixia244, adsf0427


galtransl's Issues

不明原因停止

Traceback (most recent call last):
File "D:\HKR_v121\GalTransl2.3.3\run_GalTransl.py", line 37, in
run_galtransl(cfg, translator)
File "D:\HKR_v121\GalTransl2.3.3\GalTransl\Runner.py", line 18, in run_galtransl
doNewBingTranslate(cfg)
File "D:\HKR_v121\GalTransl2.3.3\GalTransl\Frontend\GPT.py", line 286, in doNewBingTranslate
gptapi.batch_translate(
File "D:\HKR_v121\GalTransl2.3.3\GalTransl\Backend\BingGPT4Translate.py", line 382, in batch_translate
num, trans_result = asyncio.run(
^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\asyncio\runners.py", line 190, in run
return runner.run(main)
^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\asyncio\runners.py", line 118, in run
return self._loop.run_until_complete(task)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\asyncio\base_events.py", line 653, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "D:\HKR_v121\GalTransl2.3.3\GalTransl\Backend\BingGPT4Translate.py", line 303, in translate
await self._change_cookie()
File "D:\HKR_v121\GalTransl2.3.3\GalTransl\Backend\BingGPT4Translate.py", line 424, in _change_cookie
self.chatbot = await Chatbot.create(
^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\EdgeGPT\EdgeGPT.py", line 42, in create
await Conversation.create(self.proxy, cookies=cookies),
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\EdgeGPT\conversation.py", line 103, in create
response = await client.get(
^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\httpx_client.py", line 1757, in get
return await self.request(
^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\httpx_client.py", line 1530, in request
return await self.send(request, auth=auth, follow_redirects=follow_redirects)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\httpx_client.py", line 1617, in send
response = await self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\httpx_client.py", line 1645, in _send_handling_auth
response = await self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\httpx_client.py", line 1682, in _send_handling_redirects
response = await self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\httpx_client.py", line 1719, in _send_single_request
response = await transport.handle_async_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\httpx_transports\default.py", line 353, in handle_async_request
resp = await self._pool.handle_async_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\httpcore_async\connection_pool.py", line 261, in handle_async_request
raise exc
File "D:\Python Files\python311\Lib\site-packages\httpcore_async\connection_pool.py", line 245, in handle_async_request
response = await connection.handle_async_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\httpcore_async\http_proxy.py", line 271, in handle_async_request
connect_response = await self._connection.handle_async_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\httpcore_async\connection.py", line 69, in handle_async_request
stream = await self._connect(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\httpcore_async\connection.py", line 117, in _connect
stream = await self._network_backend.connect_tcp(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\httpcore\backends\auto.py", line 31, in connect_tcp
return await self._backend.connect_tcp(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\httpcore\backends\asyncio.py", line 114, in connect_tcp
stream: anyio.abc.ByteStream = await anyio.connect_tcp(
^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\site-packages\anyio_core_sockets.py", line 221, in connect_tcp
await event.wait()
File "D:\Python Files\python311\Lib\site-packages\anyio_backends_asyncio.py", line 1778, in wait
if await self._event.wait():
^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Python Files\python311\Lib\asyncio\locks.py", line 213, in wait
await fut
asyncio.exceptions.CancelledError

D:\HKR_v121\GalTransl2.3.3>pause
请按任意键继续. . .

Installation is stuck at building wheel

Expected behaviour

I expected the module to install very fast.

Actual behaviour

Building wheel for tiktoken-0.4.0 takes forever to run.

Steps to reproduce

OS - Windows 11 build 22621.1848
architecture - x64
python version - 3.12.0a5

執行出現以下錯誤訊息

Traceback (most recent call last):
File "E:\GalTransl-main\GalTransl\Backend\GPT3Translate.py", line 138, in asyncTranslate
for data in self.chatbot.ask_stream(prompt_req):
File "C:\Users\User\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\revChatGPT\V3.py", line 203, in ask_stream
raise t.APIConnectionError(
revChatGPT.typings.APIConnectionError: 401 Unauthorized {
"error": {
"message": "",
"type": "invalid_request_error",
"param": null,
"code": "invalid_api_key"
}
}

Please check if there is a problem with your network connection
Please check that the input is correct, or you can resolve this issue by filing an issue
Project URL: https://github.com/acheong08/ChatGPT
[2023-06-11 23:32:52,157][ERROR]Error:401 Unauthorized {
"error": {
"message": "",
"type": "invalid_request_error",
"param": null,
"code": "invalid_api_key"
新手使用,出現這個實在不知道如何解決...

对括号识别有问题

翻译诸如
キース・ジャレットの99年のアルバム、『The Melody at Night , With You』から、『My Wild Irish Rose』。
自分も目の端を拭って、笑って答える。「ありがとう」と。
这样的句子就会报错,GPT3.5会提示“多余括号”不断重试,newbing直接跳出。

GalTransl\Backend\GPT3Translate.py中的

                # 多余符号
                elif ("(" in result[key_name] or "(" in result[key_name]) and (
                    "(" not in content[i].post_jp and "(" not in content[i].post_jp
                ):
                    LOGGER.info(
                        f"->第{content[i].index}句多余括号:" + result[key_name] + "\n"
                    )
                    error_flag = True
                    break

把True改成False就行了,估计是判定规则有点问题?

两种会出错的情况。

"name": "優子",
"message": "「あら……亮くん、寝ちゃったの?」"

},
{
"message": "優子問いかけるように、声をかけるが、返事はない。"
},

=================================================
像上面的这一句,message的内容("優子問いかけるように、声をかけるが、返事はない。")第一个词就是名字"優子",然后GPT翻译的时候,有时候会自动给你在上面加上"name": "優子",变成下面这样
"name": "優子",
"message": "「あら……亮くん、寝ちゃったの?」"
},
{
"name": "優子",
"message": "優子問いかけるように、声をかけるが、返事はない。"
},

然后返回的时候对不上就会出错,只返回【Failed translation】。其实内容都翻译了,就是GPT自作聪明地多加了"name": "優子",导致返回的时候对不上。

"message": "優子曾经说,「声をかけるが、返事はない。」"
这样会翻译成
"message": "優子曾经:"翻译内容”"

这样也会出错,因为最后那里会有两个双引号连在一起;另一种情况是前面会有两个双引号连在一起。这种时候就会报错,返回【Failed translation】

希望大佬看看有没办修复或者避免这两种出错的情况。

GPT擅自合并文本导致行数错误重试

有些时候GPT倾向于把多段文本合并成一段,后续行数检查失败会多次重试,部分语句会接近死循环(看到#10增加了重试次数上限但我在2.1.1内没有触发)。

修改gpt.numPerRequestTranslate=1可以解决问题,就是效率会很差。
是否可以在出现类似错误的时候临时调整参数?

示例如下,其中373号会被合并掉。

# Glossary
| Src | Dst(/Dst2/..) | Note |
| --- | --- | --- |
| ヨツバ | ヨツバ | boy |
| ヒメ | ヒメ | girl |

[
   {
      "id":365,
      "src":"ヨツバ「選択的な要素ってゲートのことですか?」"
   },
   {
      "id":366,
      "src":"ヒメ「そうね、変な呼び方だけど、なにしろ先端脳科学でさえ私たちがゲー トと呼んでいるものが何なのかわかっていないのだから」"
   },
   {
      "id":367,
      "src":"ヨツバ「どんな病気なんですか?」"
   },
   {
      "id":368,
      "src":"ヒメ「メンタルの問題よ」"
   },
   {
      "id":369,
      "src":"ヨツバ「精神……ですか?」"
   },
   {
      "id":370,
      "src":"ヒメ「”こころ”……と言いなさい」"
   },
   {
      "id":371,
      "src":"ヨツバ「人とのコミュニケーションが嫌だから自分の殻に閉じ籠っちゃうんですか?」"
   },
   {
      "id":372,
      "src":"ヒメ「それもきっと原因のひとつね。"
   },
   {
      "id":373,
      "src":"テレパシーはコミュニケーションの道具としてはとっても優秀だけれどね、聞きたくないこと、知りたくないことまで全て頭の中に流れてくるのよ」"
   }
]
[
   {
      "id":365,
      "dst":"ヨツバ「选择性要素是指门的事吗?」"
   },
   {
      "id":366,
      "dst":"ヒメ「是的,虽然叫法很奇怪,但连先进的脑科学也不知道我们所称之为门的东西 到底是什么」"
   },
   {
      "id":367,
      "dst":"ヨツバ「是什么病?」"
   },
   {
      "id":368,
      "dst":"ヒメ「是心理问题」"
   },
   {
      "id":369,
      "dst":"ヨツバ「精神……吗?」"
   },
   {
      "id":370,
      "dst":"ヒメ「说‘心’……」"
   },
   {
      "id":371,
      "dst":"ヨツバ「因为讨厌与人交流所以闭门不出吗?」"
   },
   {
      "id":372,
      "dst":"ヒメ「那也一定是原因之一。虽然心灵感应作为沟通工具非常优秀,但所有不想听的事情、不想知道的事情都会涌入脑海中」"
   }
]

好像有Bug

报错代码
Traceback (most recent call last):
File "c:\Users\Desktop\cqb\chat\项目示例\ChatGPT项目示例\run_galtransl.py", line 27, in
transl_api.batch_translate(
File "c:\Users\Desktop\cqb\chat\项目示例\ChatGPT项目示例\chatgpt_transl_api.py", line 236, in batch_translate
trans_result = asyncio.run(
^^^^^^^^^^^^
File "C:\Program Files\Python311\Lib\asyncio\runners.py", line 190, in run
return runner.run(main)
^^^^^^^^^^^^^^^^
File "C:\Program Files\Python311\Lib\asyncio\runners.py", line 118, in run
return self._loop.run_until_complete(task)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Program Files\Python311\Lib\asyncio\base_events.py", line 653, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "c:\Users\Desktop\cqb\chat\项目示例\ChatGPT项目示例\chatgpt_transl_api.py", line 186, in chatgpt_translate
if result[key_name].startswith("\r\n") and not trans_list[

bing无法使用

图片
求解答,已挂系统代理,config的vpn为false
顺便问下,哪种bing的cookie有效,必须是上面那个吗?
图片
图片

建议报错达到一定次数后自动停止翻译

如果我理解的不错,即便是失败的请求也会算一次请求吧,一直失败的话挺浪费额度的(
gpt3.5的极限亲测应该在一次24句左右,那个免费项目提供的一小时120次完全够用。
但是中间偶尔会因为莫名其妙的符号问题报错,如果不在边上看着及时调整的话就会很快刷完额度。。。

已替换EdgeGPT 但牛冰仍旧报错

使用牛冰报错,已替换EdgeGPT,cookie刚替换,ip正常

2023-08-16 01:41:52,296][INFO]Error:server rejected WebSocket connection: HTTP 400, Please wait 30 seconds
Traceback (most recent call last):
  File "/media/kaban/6BA154E35001F032/GalTransl/GalTransl/Backend/BingGPT4Translate.py", line 178, in translate
    async for final, response in self.chatbot.ask_stream(
  File "/home/kaban/.cache/pypoetry/virtualenvs/galtransl-0jM_G3sJ-py3.11/lib/python3.11/site-packages/EdgeGPT/EdgeGPT.py", line 162, in ask_stream
    async for response in self.chat_hub.ask_stream(
  File "/home/kaban/.cache/pypoetry/virtualenvs/galtransl-0jM_G3sJ-py3.11/lib/python3.11/site-packages/EdgeGPT/chathub.py", line 111, in ask_stream
    async with connect(
  File "/home/kaban/.cache/pypoetry/virtualenvs/galtransl-0jM_G3sJ-py3.11/lib/python3.11/site-packages/websockets/legacy/client.py", line 637, in __aenter__
    return await self
           ^^^^^^^^^^
  File "/home/kaban/.cache/pypoetry/virtualenvs/galtransl-0jM_G3sJ-py3.11/lib/python3.11/site-packages/websockets/legacy/client.py", line 655, in __await_impl_timeout__
    return await self.__await_impl__()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/kaban/.cache/pypoetry/virtualenvs/galtransl-0jM_G3sJ-py3.11/lib/python3.11/site-packages/websockets/legacy/client.py", line 662, in __await_impl__
    await protocol.handshake(
  File "/home/kaban/.cache/pypoetry/virtualenvs/galtransl-0jM_G3sJ-py3.11/lib/python3.11/site-packages/websockets/legacy/client.py", line 329, in handshake
    raise InvalidStatusCode(status_code, response_headers)
websockets.exceptions.InvalidStatusCode: server rejected WebSocket connection: HTTP 400

最新release中newbing报错

翻译器:newbing
[2023-12-18 00:41:39,302][INFO]HTTP Request: GET http://www.gstatic.com/generate_204 "HTTP/1.1 204 No Content"
[2023-12-18 00:41:39,303][INFO]载入 00通用字典_译前.txt  26普通;
[2023-12-18 00:41:39,305][INFO]载入 01H字典_矫正_译前.txt  145普通;
[2023-12-18 00:41:39,306][INFO]载入 00通用字典_符号_译后.txt  9普通;23条件;
[2023-12-18 00:41:39,306][INFO]载入 00通用字典_译后.txt  1普通;5条件;
[2023-12-18 00:41:39,307][INFO]载入 GPT字典: GPT字典.txt 7个词条
[2023-12-18 00:41:39,307][INFO]载入 GPT字典: 项目GPT字典.txt 8个词条
[2023-12-18 00:41:39,307][WARNING]不使用代理
'<' not supported between instances of 'NoneType' and 'int'
Traceback (most recent call last):
  File "F:\GalTransl\GalTransl\__main__.py", line 27, in worker
    run(run_galtransl(cfg, translator))
  File "C:\Users\pc\AppData\Local\Programs\Python\Python311\Lib\asyncio\runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "C:\Users\pc\AppData\Local\Programs\Python\Python311\Lib\asyncio\runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\pc\AppData\Local\Programs\Python\Python311\Lib\asyncio\base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "F:\GalTransl\GalTransl\Runner.py", line 32, in run_galtransl
    await doNewBingTranslate(cfg, proxyPool)
  File "F:\GalTransl\GalTransl\Frontend\GPT.py", line 418, in doNewBingTranslate
    semaphore = Semaphore(projectConfig.getKey("workersPerProject"))
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\pc\AppData\Local\Programs\Python\Python311\Lib\asyncio\locks.py", line 347, in __init__
    if value < 0:
       ^^^^^^^^^
TypeError: '<' not supported between instances of 'NoneType' and 'int'

尝试了3.10.11和3.11.6的py,报错相同

没有重翻NewBing拒绝翻译的句子

如题
设置部分

# 通用(杂项)设置
common:
  loggingLevel: info # 日志等级,可选 [debug/info/warning/error]
  retranslFail: True # 重翻NewBing拒绝翻译的句子。[True/False]
  multiThread: false # 多线程,[True/False](暂不可用)
  gpt.streamOutputMode: true # 流式输出/打字机效果,开启方便观察过程,关闭方便观察结果。[True/False]
  gpt.numPerRequestTranslate: 9 # 单次翻译句子数量,不建议太大
  gpt.enableProofRead: False # (NewBing/GPT4)是否开启译后校润。[True/False]
  gpt.numPerRequestProofRead: 7 # (NewBing/GPT4)单次校润句子数量,不建议修改
  gpt.degradeBackend: false # (GPT3.5/4 官方API)是否将 GPT4 的key用于 GPT3.5 的请求。[True/False]
  gpt.restoreContextMode: True # (GPT3.5/4 官方API)重启自动恢复上下文。[True/False]
  gpt.fullContextMode: True # (GPT3.5/4 官方API)尽可能多的保留前文,翻译逻辑性更好,消耗token约翻4倍。[True/False]
  gpt.lineBreaksImprovementMode: false # (GPT3.5)换行符改善模式,减少丢换行符情况,但可能导致循环重试。[True/False]
  gpt.recordConfidence: true # (GPT4)记录确信度、存疑句,GPT4官方API关掉可节约token。[True/False]
  gpt.forceNewBingHs: false # (NewBing)强制NewBing翻译hs,导致速度变得很慢且可能更容易被ban号。[True/False]

部分没有重新翻译的翻译缓存

    {
        "index": 49,
        "name": "",
        "pre_jp": "女の子が少し動くと、ワンピースの脇から見えている乳房の先に時々乳首が見えたりして。",
        "post_jp": "女の子が少し動くと、ワンピースの脇から見えている乳房の先に時々乳首が見えたりして。",
        "pre_zh": "Failed translation",
        "proofread_zh": "Failed translation",
        "trans_by": "NewBing(Failed)",
        "proofread_by": "NewBing(Failed)",
        "post_zh_preview": "Failed translation"
    },
    {
        "index": 50,
        "name": "",
        "pre_jp": "可愛らしいピンク色の突起は、思わず手を伸ばして触りたくなる程魅惑的だ。",
        "post_jp": "可愛らしいピンク色の突起は、思わず手を伸ばして触りたくなる程魅惑的だ。",
        "pre_zh": "Failed translation",
        "proofread_zh": "Failed translation",
        "trans_by": "NewBing(Failed)",
        "proofread_by": "NewBing(Failed)",
        "post_zh_preview": "Failed translation"
    },
    {
        "index": 51,
        "name": "女の子",
        "pre_jp": "「あ〜……あつい〜……」",
        "post_jp": "あ〜……あつい〜……",
        "pre_zh": "Failed translation",
        "proofread_zh": "Failed translation",
        "problem": "比日文长1.7倍",
        "trans_by": "NewBing(Failed)",
        "proofread_by": "NewBing(Failed)",
        "post_zh_preview": "「Failed translation」"
    },
    {
        "index": 52,
        "name": "",
        "pre_jp": "そう言ってワンピースの胸元をパタパタさせたりすると、ますますその下にある可愛らしいオッパイがハッキリと俺の目に映った。",
        "post_jp": "そう言ってワンピースの胸元をパタパタさせたりすると、ますますその下にある可愛らしいオッパイがハッキリと俺の目に映った。",
        "pre_zh": "Failed translation",
        "proofread_zh": "Failed translation",
        "trans_by": "NewBing(Failed)",
        "proofread_by": "NewBing(Failed)",
        "post_zh_preview": "Failed translation"
    },
    {
        "index": 53,
        "name": "",
        "pre_jp": "そのサイズは完全に手のひらサイズでプリンとしていて触り心地が良さそうだ。",
        "post_jp": "そのサイズは完全に手のひらサイズでプリンとしていて触り心地が良さそうだ。",
        "pre_zh": "Failed translation",
        "proofread_zh": "Failed translation",
        "trans_by": "NewBing(Failed)",
        "proofread_by": "NewBing(Failed)",
        "post_zh_preview": "Failed translation"
    },
    {
        "index": 54,
        "name": "",
        "pre_jp": "そして、その可愛い乳房の先にある乳首は、プックリと膨らんでいてキレイなピンク色をしていた。",
        "post_jp": "そして、その可愛い乳房の先にある乳首は、プックリと膨らんでいてキレイなピンク色をしていた。",
        "pre_zh": "Failed translation",
        "proofread_zh": "Failed translation",
        "trans_by": "NewBing(Failed)",
        "proofread_by": "NewBing(Failed)",
        "post_zh_preview": "Failed translation"
    },

游戏的解包 封包

大佬们能对这一方面说得更清楚一点嘛,纯小白,文本提出来翻译后,不知道怎么封回去让游戏运行

是否支持多开功能

比如说,把文本拆开,开多个窗口和API,分别进行机翻。
有大佬试过吗?

运行报错以及相关错误

运行时报错,尝试解决但是不知道方法,求指点,万分感谢!!!!!!
image
image
卡在这了,是依赖没弄好吗还是什么问题,求指点,万分感谢

第二个错误详情:C:\Users\11827>pip install EdgeGPT
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Collecting EdgeGPT
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/79/a5/26ad163069c906db32b284307d55607e2095548a692bf3f1f3869f7ef210/EdgeGPT-0.13.2-py3-none-any.whl (24 kB)
Requirement already satisfied: httpx[socks]>=0.24.0 in c:\users\11827\appdata\local\programs\python\python312\lib\site-packages (from EdgeGPT) (0.25.0)
Collecting aiohttp (from EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/fd/01/f180d31923751fd20185c96938994823f00918ee5ac7b058edc005382406/aiohttp-3.8.6.tar.gz (7.4 MB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Installing backend dependencies ... done
Preparing metadata (pyproject.toml) ... done
Collecting websockets (from EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/47/96/9d5749106ff57629b54360664ae7eb9afd8302fad1680ead385383e33746/websockets-11.0.3-py3-none-any.whl (118 kB)
Collecting rich (from EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/be/2a/4e62ff633612f746f88618852a626bbe24226eba5e7ac90e91dcfd6a414e/rich-13.6.0-py3-none-any.whl (239 kB)
Requirement already satisfied: certifi in c:\users\11827\appdata\local\programs\python\python312\lib\site-packages (from EdgeGPT) (2023.7.22)
Collecting prompt-toolkit (from EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/a9/b4/ba77c84edf499877317225d7b7bc047a81f7c2eed9628eeb6bab0ac2e6c9/prompt_toolkit-3.0.39-py3-none-any.whl (385 kB)
Requirement already satisfied: requests in c:\users\11827\appdata\local\programs\python\python312\lib\site-packages (from EdgeGPT) (2.31.0)
Collecting BingImageCreator>=0.4.4 (from EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/91/5e/04629695f684182eda5d3d2dfada781f05341a48acf6ec93b46815db1a9b/BingImageCreator-0.5.0-py3-none-any.whl (6.8 kB)
Collecting regex (from BingImageCreator>=0.4.4->EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/d3/10/6f2d5f8635d7714ad97ce6ade7a643358c4f3e45cde4ed12b7150734a8f3/regex-2023.10.3-cp312-cp312-win_amd64.whl (268 kB)
Requirement already satisfied: httpcore<0.19.0,>=0.18.0 in c:\users\11827\appdata\local\programs\python\python312\lib\site-packages (from httpx[socks]>=0.24.0->EdgeGPT) (0.18.0)
Requirement already satisfied: idna in c:\users\11827\appdata\local\programs\python\python312\lib\site-packages (from httpx[socks]>=0.24.0->EdgeGPT) (3.4)
Requirement already satisfied: sniffio in c:\users\11827\appdata\local\programs\python\python312\lib\site-packages (from httpx[socks]>=0.24.0->EdgeGPT) (1.3.0)
Collecting socksio==1.* (from httpx[socks]>=0.24.0->EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/37/c3/6eeb6034408dac0fa653d126c9204ade96b819c936e136c5e8a6897eee9c/socksio-1.0.0-py3-none-any.whl (12 kB)
Requirement already satisfied: attrs>=17.3.0 in c:\users\11827\appdata\local\programs\python\python312\lib\site-packages (from aiohttp->EdgeGPT) (23.1.0)
Requirement already satisfied: charset-normalizer<4.0,>=2.0 in c:\users\11827\appdata\local\programs\python\python312\lib\site-packages (from aiohttp->EdgeGPT) (3.3.0)
Collecting multidict<7.0,>=4.5 (from aiohttp->EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/4a/15/bd620f7a6eb9aa5112c4ef93e7031bcd071e0611763d8e17706ef8ba65e0/multidict-6.0.4.tar.gz (51 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Installing backend dependencies ... done
Preparing metadata (pyproject.toml) ... done
Collecting async-timeout<5.0,>=4.0.0a3 (from aiohttp->EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/a7/fa/e01228c2938de91d47b307831c62ab9e4001e747789d0b05baf779a6488c/async_timeout-4.0.3-py3-none-any.whl (5.7 kB)
Collecting yarl<2.0,>=1.0 (from aiohttp->EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/5f/3f/04b3c5e57844fb9c034b09c5cb6d2b43de5d64a093c30529fd233e16cf09/yarl-1.9.2.tar.gz (184 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting frozenlist>=1.1.1 (from aiohttp->EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/8c/1f/49c96ccc87127682ba900b092863ef7c20302a2144b3185412a08480ca22/frozenlist-1.4.0.tar.gz (90 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting aiosignal>=1.1.2 (from aiohttp->EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/76/ac/a7305707cb852b7e16ff80eaf5692309bde30e2b1100a1fcacdc8f731d97/aiosignal-1.3.1-py3-none-any.whl (7.6 kB)
Collecting wcwidth (from prompt-toolkit->EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/58/19/a9ce39f89cf58cf1e7ce01c8bb76ab7e2c7aadbc5a2136c3e192097344f5/wcwidth-0.2.8-py2.py3-none-any.whl (31 kB)
Requirement already satisfied: urllib3<3,>=1.21.1 in c:\users\11827\appdata\local\programs\python\python312\lib\site-packages (from requests->EdgeGPT) (2.0.6)
Collecting markdown-it-py>=2.2.0 (from rich->EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/42/d7/1ec15b46af6af88f19b8e5ffea08fa375d433c998b8a7639e76935c14f1f/markdown_it_py-3.0.0-py3-none-any.whl (87 kB)
Collecting pygments<3.0.0,>=2.13.0 (from rich->EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/43/88/29adf0b44ba6ac85045e63734ae0997d3c58d8b1a91c914d240828d0d73d/Pygments-2.16.1-py3-none-any.whl (1.2 MB)
Requirement already satisfied: anyio<5.0,>=3.0 in c:\users\11827\appdata\local\programs\python\python312\lib\site-packages (from httpcore<0.19.0,>=0.18.0->httpx[socks]>=0.24.0->EdgeGPT) (4.0.0)
Requirement already satisfied: h11<0.15,>=0.13 in c:\users\11827\appdata\local\programs\python\python312\lib\site-packages (from httpcore<0.19.0,>=0.18.0->httpx[socks]>=0.24.0->EdgeGPT) (0.14.0)
Collecting mdurl~=0.1 (from markdown-it-py>=2.2.0->rich->EdgeGPT)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl (10.0 kB)
Building wheels for collected packages: aiohttp, frozenlist, multidict, yarl
Building wheel for aiohttp (pyproject.toml) ... error
error: subprocess-exited-with-error

× Building wheel for aiohttp (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [100 lines of output]
*********************
* Accelerated build *
*********************
running bdist_wheel
running build
running build_py
creating build
creating build\lib.win-amd64-cpython-312
creating build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\abc.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\base_protocol.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\client.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\client_exceptions.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\client_proto.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\client_reqrep.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\client_ws.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\connector.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\cookiejar.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\formdata.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\hdrs.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\helpers.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\http.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\http_exceptions.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\http_parser.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\http_websocket.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\http_writer.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\locks.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\log.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\multipart.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\payload.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\payload_streamer.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\pytest_plugin.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\resolver.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\streams.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\tcp_helpers.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\test_utils.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\tracing.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\typedefs.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\web.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\web_app.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\web_exceptions.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\web_fileresponse.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\web_log.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\web_middlewares.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\web_protocol.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\web_request.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\web_response.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\web_routedef.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\web_runner.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\web_server.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\web_urldispatcher.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\web_ws.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\worker.py -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp_init_.py -> build\lib.win-amd64-cpython-312\aiohttp
running egg_info
writing aiohttp.egg-info\PKG-INFO
writing dependency_links to aiohttp.egg-info\dependency_links.txt
writing requirements to aiohttp.egg-info\requires.txt
writing top-level names to aiohttp.egg-info\top_level.txt
reading manifest file 'aiohttp.egg-info\SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'aiohttp' anywhere in distribution
warning: no previously-included files matching '*.pyc' found anywhere in distribution
warning: no previously-included files matching '*.pyd' found anywhere in distribution
warning: no previously-included files matching '*.so' found anywhere in distribution
warning: no previously-included files matching '*.lib' found anywhere in distribution
warning: no previously-included files matching '*.dll' found anywhere in distribution
warning: no previously-included files matching '*.a' found anywhere in distribution
warning: no previously-included files matching '*.obj' found anywhere in distribution
warning: no previously-included files found matching 'aiohttp*.html'
no previously-included directories found matching 'docs_build'
adding license file 'LICENSE.txt'
writing manifest file 'aiohttp.egg-info\SOURCES.txt'
copying aiohttp_cparser.pxd -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp_find_header.pxd -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp_headers.pxi -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp_helpers.pyi -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp_helpers.pyx -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp_http_parser.pyx -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp_http_writer.pyx -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp_websocket.pyx -> build\lib.win-amd64-cpython-312\aiohttp
copying aiohttp\py.typed -> build\lib.win-amd64-cpython-312\aiohttp
creating build\lib.win-amd64-cpython-312\aiohttp.hash
copying aiohttp.hash_cparser.pxd.hash -> build\lib.win-amd64-cpython-312\aiohttp.hash
copying aiohttp.hash_find_header.pxd.hash -> build\lib.win-amd64-cpython-312\aiohttp.hash
copying aiohttp.hash_helpers.pyi.hash -> build\lib.win-amd64-cpython-312\aiohttp.hash
copying aiohttp.hash_helpers.pyx.hash -> build\lib.win-amd64-cpython-312\aiohttp.hash
copying aiohttp.hash_http_parser.pyx.hash -> build\lib.win-amd64-cpython-312\aiohttp.hash
copying aiohttp.hash_http_writer.pyx.hash -> build\lib.win-amd64-cpython-312\aiohttp.hash
copying aiohttp.hash_websocket.pyx.hash -> build\lib.win-amd64-cpython-312\aiohttp.hash
copying aiohttp.hash\hdrs.py.hash -> build\lib.win-amd64-cpython-312\aiohttp.hash
running build_ext
building 'aiohttp._websocket' extension
creating build\temp.win-amd64-cpython-312
creating build\temp.win-amd64-cpython-312\Release
creating build\temp.win-amd64-cpython-312\Release\aiohttp
"C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\BIN\x86_amd64\cl.exe" /c /nologo /O2 /W3 /GL /DNDEBUG /MD -IC:\Users\11827\AppData\Local\Programs\Python\Python312\include -IC:\Users\11827\AppData\Local\Programs\Python\Python312\Include "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\INCLUDE" "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\ATLMFC\INCLUDE" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.10240.0\ucrt" /Tcaiohttp/_websocket.c /Fobuild\temp.win-amd64-cpython-312\Release\aiohttp/_websocket.obj
_websocket.c
c:\users\11827\appdata\local\programs\python\python312\include\pyconfig.h(230): fatal error C1083: 无法打开包括文 件: “basetsd.h”: No such file or directory
error: command 'C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\BIN\x86_amd64\cl.exe' failed with exit code 2
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for aiohttp
Building wheel for frozenlist (pyproject.toml) ... error
error: subprocess-exited-with-error

× Building wheel for frozenlist (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [39 lines of output]
*********************
* Accelerated build *
*********************
running bdist_wheel
running build
running build_py
creating build
creating build\lib.win-amd64-cpython-312
creating build\lib.win-amd64-cpython-312\frozenlist
copying frozenlist_init_.py -> build\lib.win-amd64-cpython-312\frozenlist
running egg_info
writing frozenlist.egg-info\PKG-INFO
writing dependency_links to frozenlist.egg-info\dependency_links.txt
writing top-level names to frozenlist.egg-info\top_level.txt
reading manifest file 'frozenlist.egg-info\SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.pyc' found anywhere in distribution
warning: no previously-included files matching '*.pyd' found anywhere in distribution
warning: no previously-included files matching '*.so' found anywhere in distribution
warning: no previously-included files matching '*.lib' found anywhere in distribution
warning: no previously-included files matching '*.dll' found anywhere in distribution
warning: no previously-included files matching '*.a' found anywhere in distribution
warning: no previously-included files matching '*.obj' found anywhere in distribution
warning: no previously-included files found matching 'frozenlist*.html'
no previously-included directories found matching 'docs_build'
adding license file 'LICENSE'
writing manifest file 'frozenlist.egg-info\SOURCES.txt'
copying frozenlist_init_.pyi -> build\lib.win-amd64-cpython-312\frozenlist
copying frozenlist_frozenlist.pyx -> build\lib.win-amd64-cpython-312\frozenlist
copying frozenlist\py.typed -> build\lib.win-amd64-cpython-312\frozenlist
running build_ext
building 'frozenlist._frozenlist' extension
creating build\temp.win-amd64-cpython-312
creating build\temp.win-amd64-cpython-312\Release
creating build\temp.win-amd64-cpython-312\Release\frozenlist
"C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\BIN\x86_amd64\cl.exe" /c /nologo /O2 /W3 /GL /DNDEBUG /MD -IC:\Users\11827\AppData\Local\Programs\Python\Python312\include -IC:\Users\11827\AppData\Local\Programs\Python\Python312\Include "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\INCLUDE" "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\ATLMFC\INCLUDE" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.10240.0\ucrt" /Tcfrozenlist/_frozenlist.c /Fobuild\temp.win-amd64-cpython-312\Release\frozenlist/_frozenlist.obj
_frozenlist.c
c:\users\11827\appdata\local\programs\python\python312\include\pyconfig.h(230): fatal error C1083: 无法打开包括文 件: “basetsd.h”: No such file or directory
error: command 'C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\BIN\x86_amd64\cl.exe' failed with exit code 2
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for frozenlist
Building wheel for multidict (pyproject.toml) ... error
error: subprocess-exited-with-error

× Building wheel for multidict (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [80 lines of output]
*********************
* Accelerated build *
*********************
running bdist_wheel
running build
running build_py
creating build
creating build\lib.win-amd64-cpython-312
creating build\lib.win-amd64-cpython-312\multidict
copying multidict_abc.py -> build\lib.win-amd64-cpython-312\multidict
copying multidict_compat.py -> build\lib.win-amd64-cpython-312\multidict
copying multidict_multidict_base.py -> build\lib.win-amd64-cpython-312\multidict
copying multidict_multidict_py.py -> build\lib.win-amd64-cpython-312\multidict
copying multidict_init_.py -> build\lib.win-amd64-cpython-312\multidict
running egg_info
writing multidict.egg-info\PKG-INFO
writing dependency_links to multidict.egg-info\dependency_links.txt
writing top-level names to multidict.egg-info\top_level.txt
reading manifest file 'multidict.egg-info\SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*.pyc' found anywhere in distribution
warning: no previously-included files found matching 'multidict_multidict.html'
warning: no previously-included files found matching 'multidict*.so'
warning: no previously-included files found matching 'multidict*.pyd'
warning: no previously-included files found matching 'multidict*.pyd'
no previously-included directories found matching 'docs_build'
adding license file 'LICENSE'
writing manifest file 'multidict.egg-info\SOURCES.txt'
C:\Users\11827\AppData\Local\Temp\pip-build-env-zszuss5j\overlay\Lib\site-packages\setuptools\command\build_py.py:204: _Warning: Package 'multidict._multilib' is absent from the packages configuration.
!!

          ********************************************************************************
          ############################
          # Package would be ignored #
          ############################
          Python recognizes 'multidict._multilib' as an importable package[^1],
          but it is absent from setuptools' `packages` configuration.

          This leads to an ambiguous overall configuration. If you want to distribute this
          package, please make sure that 'multidict._multilib' is explicitly added
          to the `packages` configuration field.

          Alternatively, you can also rely on setuptools' discovery methods
          (for example by using `find_namespace_packages(...)`/`find_namespace:`
          instead of `find_packages(...)`/`find:`).

          You can read more about "package discovery" on setuptools documentation page:

          - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html

          If you don't want 'multidict._multilib' to be distributed and are
          already explicitly excluding 'multidict._multilib' via
          `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`,
          you can try to use `exclude_package_data`, or `include-package-data=False` in
          combination with a more fine grained `package-data` configuration.

          You can read more about "package data files" on setuptools documentation page:

          - https://setuptools.pypa.io/en/latest/userguide/datafiles.html


          [^1]: For Python, any directory (with suitable naming) can be imported,
                even if it does not contain any `.py` files.
                On the other hand, currently there is no concept of package data
                directory, all directories are treated like packages.
          ********************************************************************************

  !!
    check.warn(importable)
  copying multidict\__init__.pyi -> build\lib.win-amd64-cpython-312\multidict
  copying multidict\py.typed -> build\lib.win-amd64-cpython-312\multidict
  running build_ext
  building 'multidict._multidict' extension
  creating build\temp.win-amd64-cpython-312
  creating build\temp.win-amd64-cpython-312\Release
  creating build\temp.win-amd64-cpython-312\Release\multidict
  "C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\BIN\x86_amd64\cl.exe" /c /nologo /O2 /W3 /GL /DNDEBUG /MD -IC:\Users\11827\AppData\Local\Programs\Python\Python312\include -IC:\Users\11827\AppData\Local\Programs\Python\Python312\Include "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\INCLUDE" "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\ATLMFC\INCLUDE" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.10240.0\ucrt" /Tcmultidict/_multidict.c /Fobuild\temp.win-amd64-cpython-312\Release\multidict/_multidict.obj -O2
  _multidict.c
  c:\users\11827\appdata\local\programs\python\python312\include\pyconfig.h(230): fatal error C1083: 无法打开包括文 件: “basetsd.h”: No such file or directory
  error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio 14.0\\VC\\BIN\\x86_amd64\\cl.exe' failed with exit code 2
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for multidict
Building wheel for yarl (pyproject.toml) ... error
error: subprocess-exited-with-error

× Building wheel for yarl (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [55 lines of output]
C:\Users\11827\AppData\Local\Temp\pip-build-env-g5awptgd\overlay\Lib\site-packages\setuptools\config\setupcfg.py:293: _DeprecatedConfig: Deprecated config in setup.cfg
!!

          ********************************************************************************
          The license_file parameter is deprecated, use license_files instead.

          By 2023-Oct-30, you need to update your project and remove deprecated calls
          or your builds will no longer be supported.

          See https://setuptools.pypa.io/en/latest/userguide/declarative_config.html for details.
          ********************************************************************************

  !!
    parsed = self.parsers.get(option_name, lambda x: x)(value)
  **********************
  * Accelerated build *
  **********************
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build\lib.win-amd64-cpython-312
  creating build\lib.win-amd64-cpython-312\yarl
  copying yarl\_quoting.py -> build\lib.win-amd64-cpython-312\yarl
  copying yarl\_quoting_py.py -> build\lib.win-amd64-cpython-312\yarl
  copying yarl\_url.py -> build\lib.win-amd64-cpython-312\yarl
  copying yarl\__init__.py -> build\lib.win-amd64-cpython-312\yarl
  running egg_info
  writing yarl.egg-info\PKG-INFO
  writing dependency_links to yarl.egg-info\dependency_links.txt
  writing requirements to yarl.egg-info\requires.txt
  writing top-level names to yarl.egg-info\top_level.txt
  reading manifest file 'yarl.egg-info\SOURCES.txt'
  reading manifest template 'MANIFEST.in'
  warning: no previously-included files matching '*.pyc' found anywhere in distribution
  warning: no previously-included files matching '*.cache' found anywhere in distribution
  warning: no previously-included files found matching 'yarl\*.html'
  warning: no previously-included files found matching 'yarl\*.so'
  warning: no previously-included files found matching 'yarl\*.pyd'
  no previously-included directories found matching 'docs\_build'
  adding license file 'LICENSE'
  writing manifest file 'yarl.egg-info\SOURCES.txt'
  copying yarl\__init__.pyi -> build\lib.win-amd64-cpython-312\yarl
  copying yarl\_quoting_c.pyi -> build\lib.win-amd64-cpython-312\yarl
  copying yarl\_quoting_c.pyx -> build\lib.win-amd64-cpython-312\yarl
  copying yarl\py.typed -> build\lib.win-amd64-cpython-312\yarl
  running build_ext
  building 'yarl._quoting_c' extension
  creating build\temp.win-amd64-cpython-312
  creating build\temp.win-amd64-cpython-312\Release
  creating build\temp.win-amd64-cpython-312\Release\yarl
  "C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\BIN\x86_amd64\cl.exe" /c /nologo /O2 /W3 /GL /DNDEBUG /MD -IC:\Users\11827\AppData\Local\Programs\Python\Python312\include -IC:\Users\11827\AppData\Local\Programs\Python\Python312\Include "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\INCLUDE" "-IC:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\ATLMFC\INCLUDE" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.10240.0\ucrt" /Tcyarl/_quoting_c.c /Fobuild\temp.win-amd64-cpython-312\Release\yarl/_quoting_c.obj
  _quoting_c.c
  c:\users\11827\appdata\local\programs\python\python312\include\pyconfig.h(230): fatal error C1083: 无法打开包括文件: “basetsd.h”: No such file or directory
  error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio 14.0\\VC\\BIN\\x86_amd64\\cl.exe' failed with exit code 2
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for yarl
Failed to build aiohttp frozenlist multidict yarl
ERROR: Could not build wheels for aiohttp, frozenlist, multidict, yarl, which is required to install pyproject.toml-based projects

Encountered a bug: if the text inside a 「」 quote is empty, the entire text file fails to be read.

Traceback (most recent call last):
  File "C:\Users\Administrator\Desktop\GalTransl-core\run_GalTransl.py", line 37, in <module>
    run_galtransl(cfg, translator)
  File "C:\Users\Administrator\Desktop\GalTransl-core\GalTransl\Runner.py", line 14, in run_galtransl
    doGPT3Translate(cfg, type="unoffapi")
  File "C:\Users\Administrator\Desktop\GalTransl-core\GalTransl\Frontend\GPT.py", line 120, in doGPT3Translate
    doGPT3TranslateSingleFile(
  File "C:\Users\Administrator\Desktop\GalTransl-core\GalTransl\Frontend\GPT.py", line 42, in doGPT3TranslateSingleFile
    tran.analyse_dialogue() # 解析是否为对话
    ^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Administrator\Desktop\GalTransl-core\GalTransl\CSentense.py", line 70, in analyse_dialogue
    and ord(last_symbol) - ord(first_symbol) == 1 # 是同一对
        ^^^^^^^^^^^^^^^^
TypeError: ord() expected a character, but string of length 0 found
The error message is shown above.
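
The traceback suggests that the quote-pair check in analyse_dialogue ends up calling ord() on an empty string when nothing is left between the 「」 brackets. Below is only a minimal sketch of the kind of guard that would avoid the crash; the function and variable names are hypothetical and not GalTransl's actual code.

  # Sketch: skip the quote-pair comparison when the text is too short,
  # so ord() is never called on an empty string.
  def looks_like_dialogue(message: str) -> bool:
      message = message.strip()
      if len(message) < 2:  # empty or stripped-to-nothing text cannot form a bracket pair
          return False
      first_symbol, last_symbol = message[0], message[-1]
      # Paired brackets such as 「」 are adjacent Unicode code points,
      # so the closing bracket is exactly one code point after the opening one.
      return ord(last_symbol) - ord(first_symbol) == 1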

Newbing reports an error

As the title says: after adding the cookie, an error appears and nothing gets translated. Could someone please help resolve this?

Cannot find a DLL to hijack when using UniversalInjectorFramework

BGI/Ethornell engine: after placing winmm.dll and uif_config.json in the game's root directory and launching the game, startup fails and a uif_log.log file is generated,
with the following messages:
DllMain: attach
InstallDelayedAttachHook: start
InstallDelayedAttachHook: no target module specified
InstallDelayedAttachHook: transaction
InstallDelayedAttachHook: end
There is also no DLL in the game's root directory that could be hijacked.

GBK-encoded injection problem in the GalTransl dump/inject tool

The engine is ADVHD (WillPlus) and the game is [220325] [Guilty dash] 女体でもてなす接待旅館. SJIS injection works fine, but with GBK encoding selected the injection stops at the first file and the tool becomes unresponsive.

Could a timeout-and-retry feature be added?

Recently, while using GalTransl with GPT-3.5 Turbo, I frequently and suddenly stop receiving replies from the GPT API, and the hang can last indefinitely; once I deliberately waited an hour with no change. I have to close the program manually and restart it to continue.
This may well be a problem on OpenAI's servers.
Could a timeout-and-retry mechanism be added? For example, if no reply is received within 30 seconds, resend the request.
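
A minimal sketch of the requested behaviour, assuming the translation request is an async coroutine; send_request, REQUEST_TIMEOUT, and MAX_RETRIES are hypothetical names, not part of GalTransl.

  # Sketch only: wrap a single translation request in a timeout and retry it
  # a few times before giving up.
  import asyncio

  REQUEST_TIMEOUT = 30  # seconds to wait for a reply before abandoning this attempt
  MAX_RETRIES = 3

  async def request_with_retry(send_request, payload):
      for attempt in range(1, MAX_RETRIES + 1):
          try:
              # Cancel this attempt if no reply arrives within the timeout window.
              return await asyncio.wait_for(send_request(payload), REQUEST_TIMEOUT)
          except asyncio.TimeoutError:
              print(f"No reply within {REQUEST_TIMEOUT}s, retrying ({attempt}/{MAX_RETRIES})")
      raise RuntimeError("request still failing after all retries")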

A question about environment dependencies

I'd like to ask why these dependencies just won't install. I previously had Python 3.10; following the tutorial I removed it and reinstalled 3.12, connected through a US VPN, and then the errors in the attached screenshots appeared; the dependencies still won't install.


Request for support for more languages

It would be good to add support for source text in English, Korean, Russian, and other languages; being able to switch the target language as well would be even better.

Bing cannot be used

I extracted the cookies from Newbing, but the error shown in the attached screenshots keeps appearing.
How can this be resolved?

GPT-4 error


[2023-12-16 09:42:43,310][INFO]->输出:
[2023-12-16 09:42:46,321][ERROR]-> 400 bad request {"error":{"message":"max_tokens is too large: 6151. this model supports at most 4096 completion tokens, whereas you provided 6151. (request id: 20231216014244983811118aaztsald) (request id: 20231216014244797273456z8koew4k) (request id: 20231216094244703802638pwrldvix) (request id: 20231216094244672886887sivpsufv)","type":"invalid_request_error","param":"","code":null}}
[2023-12-16 09:42:46,321][ERROR]-> 报错, 5秒后重试
Traceback (most recent call last):
  File "F:\GalTransl\GalTransl\Backend\GPT4Translate.py", line 242, in translate
    for data in self.chatbot.ask_stream(prompt_req):
  File "C:\Users\hyominli\AppData\Local\Programs\Python\Python310\lib\site-packages\revChatGPT\V3.py", line 244, in ask_stream
    raise t.APIConnectionError(
revChatGPT.typings.APIConnectionError: 400 Bad Request {"error":{"message":"max_tokens is too large: 6151. This model supports at most 4096 completion tokens, whereas you provided 6151. (request id: 20231216014244983811118AaZTsAld) (request id: 20231216014244797273456Z8koEw4K) (request id: 20231216094244703802638pwrldVIx) (request id: 20231216094244672886887sIvpsuFv)","type":"invalid_request_error","param":"","code":null}}
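
The error message itself states the cause: the request asked for 6151 completion tokens while this endpoint accepts at most 4096, so either the batch has to be smaller or the requested max_tokens has to be capped. A minimal illustration of the capping idea, using only the numbers from the error message (not GalTransl's actual configuration):

  # Sketch: never request more completion tokens than the endpoint allows.
  MODEL_COMPLETION_LIMIT = 4096   # limit reported in the 400 response
  requested_max_tokens = 6151     # what the failing request asked for

  max_tokens = min(requested_max_tokens, MODEL_COMPLETION_LIMIT)
  print(max_tokens)  # 4096 -- a request capped at this value would not be rejected for size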

Does "单次翻译句子数量" (number of sentences per translation) refer to the number of sentences translated in a single request?

If so, in theory, would a larger number of sentences per translation let you translate more sentences within Newbing's quota of 200 requests per day?

In my tests, about 15 sentences per request seems to be the limit; going above that causes errors.

If you want to translate H scenes with Newbing, setting the number of sentences per translation to 1 works, but I'm not sure whether that wastes too much of the quota.

Hoping for an answer; many thanks!
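
For reference, the arithmetic behind the quota question: at 200 requests per day, batching 15 sentences per request comes to roughly 200 × 15 = 3,000 sentences per day, while sending one sentence per request yields only 200 × 1 = 200.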

About an SJIS injection problem

After injecting the scripts there is no "sjis_ext.bin包含文字:xxx" ("sjis_ext.bin contains characters: xxx") message, yet after repacking, the game still shows missing characters. What could be going on? The injection output folder does generate an sjis_ext.bin file 😂 The engine is Majiro; previous runs all showed the normal message listing the characters contained in sjis_ext.bin.

The scripts, the JSON, and the injected files are all included in the package:
test.zip

Error when using GPT-3.5 in web-simulation mode

翻译器:chatgpt-gpt35
[2023-11-06 16:27:58,516][INFO]载入 00通用字典_译前.txt  26普通;
[2023-11-06 16:27:58,516][INFO]载入 01H字典_矫正_译前.txt  145普通;
[2023-11-06 16:27:58,516][INFO]载入 00通用字典_符号_译后.txt  9普通;23条件;
[2023-11-06 16:27:58,516][INFO]载入 00通用字典_译后.txt  1普通;5条件;
[2023-11-06 16:27:58,516][INFO]载入 GPT字典: GPT字典.txt 7个词条
[2023-11-06 16:27:58,518][INFO]载入 GPT字典: 项目GPT字典.txt 8个词条
urllib3.exceptions.SSLError: [SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1006)    
The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\pc\AppData\Local\Programs\Python\Python311\Lib\site-packages\requests\adapters.py", line 486, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "C:\Users\pc\AppData\Local\Programs\Python\Python311\Lib\site-packages\urllib3\connectionpool.py", line 845, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "C:\Users\pc\AppData\Local\Programs\Python\Python311\Lib\site-packages\urllib3\util\retry.py", line 515, in increment
    raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='bypass.churchless.tech', port=443): Max retries exceeded with url: /conversations (Caused by SSLError(SSLEOFError(8, '[SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1006)')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "F:\GalTransl\run_GalTransl.py", line 37, in <module>
    run_galtransl(cfg, translator)
  File "F:\GalTransl\GalTransl\Runner.py", line 14, in run_galtransl
    doGPT3Translate(cfg, type="unoffapi")
  File "F:\GalTransl\GalTransl\Frontend\GPT.py", line 110, in doGPT3Translate
    gptapi = CGPT35Translate(projectConfig, type)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\GalTransl\GalTransl\Backend\GPT3Translate.py", line 143, in __init__
    self.chatbot.clear_conversations()
  File "C:\Users\pc\AppData\Local\Programs\Python\Python311\Lib\site-packages\revChatGPT\V1.py", line 102, in wrapper
    out = func(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\pc\AppData\Local\Programs\Python\Python311\Lib\site-packages\revChatGPT\V1.py", line 1005, in clear_conversations
    response = self.session.patch(url, data='{"is_visible": false}')
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\pc\AppData\Local\Programs\Python\Python311\Lib\site-packages\requests\sessions.py", line 661, in patch
    return self.request("PATCH", url, data=data, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\pc\AppData\Local\Programs\Python\Python311\Lib\site-packages\requests\sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\pc\AppData\Local\Programs\Python\Python311\Lib\site-packages\requests\sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\pc\AppData\Local\Programs\Python\Python311\Lib\site-packages\requests\adapters.py", line 517, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='bypass.churchless.tech', port=443): Max retries exceeded with url: /conversations (Caused by SSLError(SSLEOFError(8, '[SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1006)')))

The error output is shown above. The relevant dependencies are installed, and the Python version is 3.11.6.

Dependency installation fails on Python 3.12

When installing dependencies on Python 3.12, the following problems occur:
(1) pyYAML 6.0.0 from the requirements list cannot be installed correctly
(2) an error occurs when installing the dependency tiktoken, so it cannot be installed either
