Tip
See 📘 Using DeepLX to learn more.
DeepL Free API (No TOKEN required)
Home Page: https://deeplx.owo.network/
License: MIT License
Hello, thank you for your excellent work. I deployed the service with the Docker version and it works great. Can I update the Docker-based service with the release you published? I replaced the deepl file in the Docker container's root directory with the amd64 binary, but it didn't take effect and the service stopped working.
Currently the API is compatible with neither the DeepL API nor the LibreTranslate one.
DeepL API: uses form data as the body, and all paths start with /v2/
POST /v2/translate HTTP/2
Content-Type: application/x-www-form-urlencoded
text=Hello%2C%20world!&target_lang=DE
200 OK
Content-Type: application/json
{
"translations": [
{
"detected_source_language": "EN",
"text": "Hallo, Welt!"
}
]
}
LibreTranslate API: uses different parameter names
POST /translate HTTP/2
Content-Type: application/json
{
"q": "Hello world!",
"source": "auto",
"target": "de"
}
200 OK
Content-Type: application/json
{
"detectedLanguage": {
"confidence": 92,
"language": "en"
},
"translatedText": "Hallo Welt!"
}
➜ ~ curl -X POST 'http://localhost:1188/translate' -H 'Content-Type:application/json' -d '{
"text": "alternatives",
"source_lang":"EN",
"target_lang": "ZH"
}'
{"alternatives":null,"code":200,"data":"","id":8378427666}%
➜ ~ yay -Q deeplx-bin
deeplx-bin 0.8.0-1
Aug 19 12:22:42 pi-sxing deeplx[500]: [GIN] 2023/08/19 - 12:22:42 | 200 | 1.029138522s | 127.0.0.1 | POST "/translate"
Installed with yay -S deeplx-bin on Manjaro ARM.
It used to work before.
macOS Bob Community Edition, with the bob plugin. After configuring it, it always reports: "Configuration encountered an unknown error, status code: undefined".
It would be great if I could use this directly as a library instead of running it as a separate server.
In the function getTimeStamp, the calculation of i_count needs a +1. When I ran the downloaded code locally, it often reported "too many requests". By checking the source code of the web version of DeepL, I found that the initial value of i_count should be 1: count the number of "i" characters in the text to be translated, then use that total plus 1 to calculate the timestamp. After I added this and recompiled, the problem went away.
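The fix described above can be sketched in Go as follows. This is a sketch of the described behaviour (count the "i"s, add 1, align the millisecond timestamp to that count), not the exact DeepLX source.

```go
package main

import (
	"fmt"
	"strings"
	"time"
)

// getTimeStamp counts the "i" characters in the text, adds 1,
// and adjusts the current millisecond timestamp so that it is
// divisible by that count.
func getTimeStamp(text string) int64 {
	iCount := int64(strings.Count(text, "i")) + 1
	ts := time.Now().UnixMilli()
	return ts - ts%iCount + iCount
}

func main() {
	// "this is it" contains three "i"s, so the result is
	// adjusted to be divisible by 4.
	fmt.Println(getTimeStamp("this is it") % 4) // prints 0
}
```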
Deployed on deno.dev
Hi
Could you add a proxy option?
Hello, after studying your code's implementation, I wrote a JS version which works fine when debugging locally. However, when I deployed it to an edge function, it always returned a 525 error. May I ask why?
When I deployed your Go version on a server, it worked normally, so I am seeking help.
// src/pages/api/translate.ts
import { NextResponse } from "next/server";
import type { NextRequest } from "next/server";
interface RequestParams {
text: string;
source_lang: string;
target_lang: string;
}
interface ResponseParams {
id: number;
code: number;
data: string;
}
async function queryAPI(data: RequestParams): Promise<ResponseParams> {
const res = await fetch("https://www2.deepl.com/jsonrpc", {
headers: {
"Content-Type": "application/json; charset=utf-8",
},
method: "POST",
body: buildBodyString(data),
});
if (res.ok) {
const result = (await res.json()) as {
jsonrpc: string;
id: number;
result: {
texts: {
text: string;
}[];
};
};
return {
id: result.id,
code: 200,
data: result?.result?.texts?.[0]?.text,
};
}
return {
id: 42,
code: res.status,
data:
res.status === 429
? "Too many requests, please try again later."
: "Unknown error.",
};
}
function buildRequestParams(sourceLang: string, targetLang: string) {
return {
jsonrpc: "2.0",
method: "LMT_handle_texts",
id: Math.floor(Math.random() * 100000 + 100000) * 1000,
params: {
texts: [{ text: "", requestAlternatives: 3 }],
timestamp: 0,
splitting: "newlines",
lang: {
source_lang_user_selected: sourceLang,
target_lang: targetLang,
},
},
};
}
function getCountOfI(translateText: string) {
return translateText.split("i").length - 1;
}
function getTimestamp(iCount: number) {
let ts = new Date().getTime();
if (iCount !== 0) {
iCount = iCount + 1;
return ts - (ts % iCount) + iCount;
} else {
return ts;
}
}
function buildBodyString(data: RequestParams) {
const post_data = buildRequestParams(
data.source_lang || "AUTO",
data.target_lang || "AUTO"
);
post_data.params.texts = [{ text: data.text, requestAlternatives: 3 }];
post_data.params.timestamp = getTimestamp(getCountOfI(data.text));
let post_str = JSON.stringify(post_data);
if (
[0, 3].includes((post_data["id"] + 5) % 29) ||
(post_data["id"] + 3) % 13 === 0
) {
post_str = post_str.replace('"method":"', '"method" : "');
} else {
post_str = post_str.replace('"method":"', '"method": "');
}
return post_str;
}
export const config = {
runtime: "edge",
};
export default async function MyEdgeFunction(request: NextRequest) {
const req = (await request.json()) as RequestParams;
const res = await queryAPI(req);
return NextResponse.json(res);
}
curl --location 'https://deeplx-edge-api.vercel.app/api/translate' \
--header 'Content-Type: application/json' \
--data '{
"text": "你好呀,请问你来自哪里",
"source_lang": "auto",
"target_lang": "en"
}'
{"id":42,"code":525,"data":"Unknown error."}
curl --location '127.0.0.1:3000/api/translate' \
--header 'Content-Type: application/json' \
--data '{
"text": "请给我一个冰淇淋",
"source_lang": "zh",
"target_lang": "en"
}'
{"id":115601000,"code":200,"data":"Please give me an ice cream"}
Would it be possible to use deepl's auto-detect language api for translation?
brew services start owo-network/brew/deeplx
==> Successfully started deeplx
(label: homebrew.mxcl.deeplx)
macOS 12.6 (21G115)
deeplx 0.7.4
Bob 0.10.0 29
http://127.0.0.1:1188/
{"code":200,"message":"DeepL Free API, Made by sjlleo and missuo. Go to /translate with POST. http://github.com/OwO-Network/DeepLX"}
Hello, I want to configure DeepL translation in Zotero on a Windows computer by entering a key. I didn't quite understand your tutorial: which file exactly should I download for Windows? (The two Windows files of the latest release can't be downloaded; the browser warns that they are unsafe.) Would it be convenient for you to provide a Windows configuration tutorial? Thank you very much.
http://123.207.59.59:8100/translate/?"text"= "Hello World"&"source_lang"= "EN"&"target_lang"= "ZH"
{"code":404,"message":"No Translate Text Found"}
The GET method works.
{'code': 429, 'message': 'Too Many Requests'}
After I finished deploying it with Docker, I can't reach it. What could be the problem?
I have multiple sentences that need to be translated at once. Translating them sentence by sentence easily triggers a 429 ban. Could translating multiple sentences in a single request be supported?
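One possible workaround (untested, and not a documented DeepLX feature): since the upstream request uses newline-based splitting, several sentences could be joined with "\n" and sent in a single /translate call, then split back apart in the response. A sketch of building such a batched request body:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// batchBody joins several sentences with newlines so they can be
// sent to a DeepLX /translate endpoint in one request instead of
// one request per sentence.
func batchBody(sentences []string, source, target string) string {
	b, _ := json.Marshal(map[string]string{
		"text":        strings.Join(sentences, "\n"),
		"source_lang": source,
		"target_lang": target,
	})
	return string(b)
}

func main() {
	body := batchBody([]string{"第一句。", "第二句。"}, "ZH", "EN")
	fmt.Println(body)
	// The translated "data" field can then be split back into
	// sentences with strings.Split(data, "\n").
}
```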
Is there a GUI to use this in windows?
I am not an advanced technical user. I just know that I pay for DeepL Pro every month and I don't want to pay for it anymore. Is there an easy way to use this on Windows?
Long requests trigger HTTP 429 errors. Could you add support for multiple IP proxies in deeplx_windows_amd64.exe?
Related projects: https://github.com/HaliComing/fpp
As the title says: has the author done any testing in this area? Please share.
How can this be solved, please?
Does this plugin still require installing the server via Docker to work? Or did I do something wrong, and it has actually already deployed the server locally?
Using brew to install and update would be more user-friendly.
It was my first time, and I just tried this:
[tbb@hai ~]$ curl -d '{"text":"hello", "source_lang":"EN", "target_lang":"DE"}' -H "Content-Type: application/json" -X POST 0.0.0.0:1188/translate
{"code":429,"message":"Too Many Requests"}
The server detects the source language if no source_lang param is given:
Line 155 in eb6afa8
Problem:
When I run "launchctl load /Library/LaunchAgents/me.missuo.deeplx.plist",
the terminal outputs:
Load failed: 5: Input/output error
Try running
launchctl bootstrap as root for richer errors.
Platform: M2 Mac
I wrote a program to translate SRT subtitle files, translating the input file line by line, but unfortunately I reached the free usage limit on DeepL without even finishing my first subtitle :P
I know this is not related to DeepLX, but the description misled me by saying: "deeplx is unlimited to the number of requests." Perhaps DeepLX itself is unlimited, but since it uses DeepL as its backend, it does, in fact, have a request limit.
I've searched the DeepL website and didn't find the request limit stated anywhere, but clearly there is one.
My mistake... I forgot to switch to the new address. Version 0.8.0 works fine.
(base) ➜ ~ brew services start owo-network/brew/deeplx
Error: uninitialized constant Homebrew::Service::System
/opt/homebrew/Library/Taps/homebrew/homebrew-services/cmd/services.rb:61:in services' /opt/homebrew/Library/Homebrew/brew.rb:94:in
Setup on [immersive-translate]: what should I set as the URL?
I have implemented, in the ComicRack translation plugin, a call to your DeepLX, and it works very well, but after a while of translating it blocked me. In fact, it doesn't even let me translate from the web anymore. I am attaching the log that the Calibre plugin gives me.
Original: Modern medicine is not actually miraculous, although the term is often used. For all practical purposes, the man was gone before any aid could be given him.
Could not retrieve data from the translation engine API.
Will retry in 5 seconds.
Retrying... (the timeout is 300 seconds).
Could not retrieve data from the translation engine API.
Will retry in 10 seconds.
Retrying... (the timeout is 300 seconds).
Could not retrieve data from the translation engine API.
Will retry in 30 seconds.
Retrying... (the timeout is 300 seconds).
Traceback (most recent call last):
File "calibre_plugins.ebook_translator.engines.base", line 81, in get_result
File "mechanize_mechanize.py", line 257, in open
File "mechanize_mechanize.py", line 313, in _mech_open
mechanize._response.get_seek_wrapper_class..httperror_seek_wrapper: HTTP Error 429: Too Many Requests
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "calibre_plugins.ebook_translator.translation", line 59, in _translate
File "calibre_plugins.ebook_translator.engines.custom", line 120, in translate
File "calibre_plugins.ebook_translator.engines.base", line 88, in get_result
Exception: Could not parse the returned response. Raw data: HTTP Error 429: Too Many Requests
(The same HTTP Error 429 traceback and exception repeat three more times.)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "runpy.py", line 196, in _run_module_as_main
File "runpy.py", line 86, in _run_code
File "site.py", line 83, in
File "site.py", line 78, in main
File "site.py", line 50, in run_entry_point
File "calibre\utils\ipc\worker.py", line 215, in main
File "calibre\utils\ipc\worker.py", line 150, in arbitrary_n
File "calibre_plugins.ebook_translator.convertion", line 124, in convert_book
File "calibre\ebooks\conversion\plumber.py", line 1281, in run
File "calibre_plugins.ebook_translator.convertion", line 120, in convert
File "calibre_plugins.ebook_translator.translation", line 109, in handle
File "calibre_plugins.ebook_translator.translation", line 87, in _handle
File "calibre_plugins.ebook_translator.translation", line 71, in _translate
File "calibre_plugins.ebook_translator.translation", line 71, in _translate
File "calibre_plugins.ebook_translator.translation", line 71, in _translate
File "calibre_plugins.ebook_translator.translation", line 63, in _translate
Exception: Could not retrieve data from the translation engine API. Could not parse the returned response. Raw data: HTTP Error 429: Too Many Requests
======================== END LOG =============================
I would like to choose when to use formal or informal tone in those languages supporting those tones. Now it mixes those styles in the translation.
I know this is a feature of DeepL Pro, but I would like to know if it can be done with DeepLX somehow.
Thank you.
Cool!
Maybe add a curl test to the README, or a Makefile:
WEB_URL=http://0.0.0.0:1199
@echo ""
@echo "GET"
curl -H "Content-Type: application/json" -X GET $(WEB_URL)/
@echo ""
@echo "POST"
curl -d '{"text":"hello", "source_lang":"EN", "target_lang":"DE"}' -H "Content-Type: application/json" -X POST $(WEB_URL)/translate
you can put this into repo if you want.
Everything works; your description in the README makes it easy to get this working.
macOS only for now; it can be made to work on Windows and Linux with a bit more effort.
BIN_NAME=deeplx
BIN_FSPATH_NAME=.bin
BIN_FSPATH=$(BIN_FSPATH_NAME)
BIN=$(BIN_FSPATH)/$(BIN_NAME)
build:
go build -o $(BIN) .
# .bin
build-cross:
chmod +x ./.cross_compile.sh
./.cross_compile.sh
# dist
WEB_URL=http://0.0.0.0:1199
build-run: build
$(BIN) -h
# http://localhost:1199/
run-test:
@echo ""
@echo "GET"
curl -H "Content-Type: application/json" -X GET $(WEB_URL)/
@echo ""
@echo "POST"
curl -d '{"text":"hello", "source_lang":"EN", "target_lang":"DE"}' -H "Content-Type: application/json" -X POST $(WEB_URL)/translate
# {"code":200,"data":"hallo","id":157799001}%
## install to bin
BIN_MAC_INSTALL_FSPATH=/usr/local/bin/$(BIN_NAME)
BIN_MAC_INSTALL_WHICH=$(shell which $(BIN_NAME))
bin-print:
@echo ""
@echo "BIN_MAC_INSTALL_FSPATH: $(BIN_MAC_INSTALL_FSPATH)"
@echo "BIN_MAC_INSTALL_WHICH $(BIN_MAC_INSTALL_WHICH)"
bin-install:
#sudo mv deeplx_darwin_amd64 /usr/local/bin/deeplx
sudo cp $(BIN) $(BIN_MAC_INSTALL_FSPATH)
bin-install-del:
rm -f $(BIN_MAC_INSTALL_WHICH)
bin-start:
$(BIN_MAC_INSTALL_WHICH)
bin-test:
$(MAKE) run-test
### install as service
SERVICE_MAC_PLIST_NAME=me.missuo.deeplx.plist
SERVICE_MAC_LAUNCH_AGENTS_FSPATH=$(HOME)/Library/LaunchAgents
SERVICE_MAC=$(SERVICE_MAC_LAUNCH_AGENTS_FSPATH)/$(SERVICE_MAC_PLIST_NAME)
service-print:
@echo ""
@echo "SERVICE_MAC_LAUNCH_AGENTS_FSPATH: $(SERVICE_MAC_LAUNCH_AGENTS_FSPATH)"
@echo "SERVICE_MAC_PLIST_NAME: $(SERVICE_MAC_PLIST_NAME)"
@echo "SERVICE_MAC: $(SERVICE_MAC)"
@echo ""
service-print-list:
# check ours is there
ls -al $(SERVICE_MAC_LAUNCH_AGENTS_FSPATH)
service-install:
cp $(REPO_NAME)/$(SERVICE_MAC_PLIST_NAME) $(SERVICE_MAC)
launchctl load $(SERVICE_MAC)
service-install-del:
launchctl unload $(SERVICE_MAC)
rm -f $(SERVICE_MAC)
service-test:
$(MAKE) run-test
service-start:
@echo "service start"
launchctl start $(SERVICE_MAC)
service-stop:
@echo "service stop"
launchctl stop $(SERVICE_MAC)
Is there the same rate limitation as on https://www.deepl.com/translator, or is it less?
I think it would be useful to add information about this in the README file.
CORS headers could be added to allow browser extensions or SPAs to access the service directly. http://localhost:<port> is a potentially trustworthy origin, so it can be accessed directly from an https origin.
package main

import (
	"github.com/gin-contrib/cors"
	"github.com/gin-gonic/gin"
)

func main() {
	router := gin.Default()
	// equivalent to:
	// config := cors.DefaultConfig()
	// config.AllowAllOrigins = true
	// router.Use(cors.New(config))
	router.Use(cors.Default())
	router.Run()
}
Using sing-box, you can turn an "airport" (proxy) subscription into a self-hosted proxy IP pool, similar to the tunnel proxies used for crawlers. The effect is shown in the screenshot below:
Could you compile your project to a Windows 32/64-bit binary? I could then install it as a service using NSSM.
I think this is very interesting because I am using a Calibre plugin to translate books, and there is an issue about this topic:
bookfere/Ebook-Translator-Calibre-Plugin#6
The problem is that Windows users will be orphaned without a Windows build of your binary.
Is there really no way to get the free API working on Windows?
DeepL may have updated the API logic, which may trigger an unusable situation at present.
Wait for the next version of DeepLX to be released.
From version v0.7.8 to the newest v0.8.0, running deeplx with --logging-level info gives an error: flag provided but not defined: -logging-level.
I had thought that was the cause of the brew issue.