
deeplx's Introduction

deeplx's People

Contributors

asukaminato0721, chenxiaolei, cijiugechu, dependabot[bot], fossabot, ifyour, k024, kabochar, legendleo, martialbe, missuo, sbilly, sjlleo, thedavidweng, yangchuansheng


deeplx's Issues

Update the Docker image with each release

Hello, and thank you for your excellent work. I deployed the service with the Docker version and it works great. Can I use the releases you publish to update the Docker-based service? I replaced the deepl file in the container's root directory with the amd64 binary, but it did not take effect and the service stopped working.

Interoperability support

Currently the API is compatible with neither the DeepL API nor the LibreTranslate one.

DeepL API: uses form data as the request body, and all paths start with /v2/

POST /v2/translate HTTP/2
Content-Type: application/x-www-form-urlencoded

text=Hello%2C%20world!&target_lang=DE
200 OK
Content-Type: application/json

{
  "translations": [
    {
      "detected_source_language": "EN",
      "text": "Hallo, Welt!"
    }
  ]
}

LibreTranslate API: uses different parameter names

POST /translate HTTP/2
Content-Type: application/json

{
    "q": "Hello world!",
    "source": "auto",
    "target": "de"
}
200 OK
Content-Type: application/json

{
    "detectedLanguage": {
        "confidence": 92,
        "language": "en"
    },
    "translatedText": "Hallo Welt!"
}

Code 200 but null

~ curl -X POST 'http://localhost:1188/translate' -H 'Content-Type:application/json' -d '{
  "text": "alternatives",
  "source_lang":"EN",
  "target_lang": "ZH"
}'
{"alternatives":null,"code":200,"data":"","id":8378427666}%
➜  ~ yay -Q deeplx-bin
deeplx-bin 0.8.0-1
Aug 19 12:22:42 pi-sxing deeplx[500]: [GIN] 2023/08/19 - 12:22:42 | 200 |  1.029138522s |       127.0.0.1 | POST     "/translate"

Installed with yay -S deeplx-bin on Manjaro ARM.

It used to work.

Bug report

macOS, Bob community edition with the Bob plugin: after configuring it, it always reports "configuration encountered an unknown error, status code: undefined".

Make it usable as a library

It would be great if I could use this directly as a library instead of having to run it as a separate server.

The calculation of i_count in getTimeStamp needs a +1

In the function getTimeStamp, the calculation of i_count needs a +1. When I downloaded the code and ran it locally, it often reported too many requests. By checking the source code of the DeepL web client, I found that the initial value of i_count should be 1: count the number of "i" characters in the text to be translated, then add 1, and use that total to calculate the timestamp. After I added this line of code and recompiled, the problem was gone.
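Based on the behaviour described above, a minimal Go sketch of the corrected timestamp calculation might look like this (the function and variable names here are illustrative, not DeepLX's actual code):

```go
package main

import (
	"fmt"
	"strings"
	"time"
)

// getTimeStamp rounds the current millisecond timestamp so that it is
// divisible by (iCount + 1), matching the behaviour described for the
// DeepL web client.
func getTimeStamp(iCount int64) int64 {
	ts := time.Now().UnixMilli()
	if iCount == 0 {
		return ts
	}
	iCount++ // the reported fix: the effective count starts at 1, not 0
	return ts - ts%iCount + iCount
}

func main() {
	text := "this is a test"
	iCount := int64(strings.Count(text, "i")) // number of "i" characters
	fmt.Println(getTimeStamp(iCount))
}
```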


Deployed to the edge function, the request always reports a 525 error.

Hello, after studying how your code works, I wrote a JS version that works fine when debugged locally. However, when I deploy it to an edge function, it always reports a 525 error. May I ask why?

When I deployed your Go version on a server, it worked normally, so I am seeking help.

// src/pages/api/translate.ts

import { NextResponse } from "next/server";
import type { NextRequest } from "next/server";

interface RequestParams {
  text: string;
  source_lang: string;
  target_lang: string;
}

interface ResponseParams {
  id: number;
  code: number;
  data: string;
}

async function queryAPI(data: RequestParams): Promise<ResponseParams> {
  const res = await fetch("https://www2.deepl.com/jsonrpc", {
    headers: {
      "Content-Type": "application/json; charset=utf-8",
    },
    method: "POST",
    body: buildBodyString(data),
  });

  if (res.ok) {
    const result = (await res.json()) as {
      jsonrpc: string;
      id: number;
      result: {
        texts: {
          text: string;
        }[];
      };
    };
    return {
      id: result.id,
      code: 200,
      data: result?.result?.texts?.[0]?.text,
    };
  }
  return {
    id: 42,
    code: res.status,
    data:
      res.status === 429
        ? "Too many requests, please try again later."
        : "Unknown error.",
  };
}

function buildRequestParams(sourceLang: string, targetLang: string) {
  return {
    jsonrpc: "2.0",
    method: "LMT_handle_texts",
    id: Math.floor(Math.random() * 100000 + 100000) * 1000,
    params: {
      texts: [{ text: "", requestAlternatives: 3 }],
      timestamp: 0,
      splitting: "newlines",
      lang: {
        source_lang_user_selected: sourceLang,
        target_lang: targetLang,
      },
    },
  };
}

function getCountOfI(translateText: string) {
  return translateText.split("i").length - 1;
}

function getTimestamp(iCount: number) {
  let ts = new Date().getTime();
  if (iCount !== 0) {
    iCount = iCount + 1;
    return ts - (ts % iCount) + iCount;
  } else {
    return ts;
  }
}

function buildBodyString(data: RequestParams) {
  const post_data = buildRequestParams(
    data.source_lang || "AUTO",
    data.target_lang || "AUTO"
  );
  post_data.params.texts = [{ text: data.text, requestAlternatives: 3 }];
  post_data.params.timestamp = getTimestamp(getCountOfI(data.text));
  let post_str = JSON.stringify(post_data);
  if (
    [0, 3].includes((post_data["id"] + 5) % 29) ||
    (post_data["id"] + 3) % 13 === 0
  ) {
    post_str = post_str.replace('"method":"', '"method" : "');
  } else {
    post_str = post_str.replace('"method":"', '"method": "');
  }

  return post_str;
}

export const config = {
  runtime: "edge",
};

export default async function MyEdgeFunction(request: NextRequest) {
  const req = (await request.json()) as RequestParams;
  const res = await queryAPI(req);
  return NextResponse.json(res);
}
curl --location 'https://deeplx-edge-api.vercel.app/api/translate' \
--header 'Content-Type: application/json' \
--data '{
    "text": "你好呀,请问你来自哪里",
    "source_lang": "auto",
    "target_lang": "en"
}'
{"id":42,"code":525,"data":"Unknown error."}
curl --location '127.0.0.1:3000/api/translate' \
--header 'Content-Type: application/json' \
--data '{
    "text": "请给我一个冰淇淋",
    "source_lang": "zh",
    "target_lang": "en"
}'
{"id":115601000,"code":200,"data":"Please give me an ice cream"}

Autodetect language

Would it be possible to use DeepL's language auto-detection API for translation?

How should Windows + Zotero be configured?

Hello, I'd like to configure DeepL translation in Zotero on Windows by entering an API key. I didn't fully understand your tutorial: which file exactly should I download for Windows? (The two Windows files in the latest release cannot be downloaded; the browser warns they are unsafe.) Would it be possible for you to provide a Windows configuration tutorial? Thank you very much.

POST method returns 404

http://123.207.59.59:8100/translate/?"text"= "Hello World"&"source_lang"= "EN"&"target_lang"= "ZH"

{"code":404,"message":"No Translate Text Found"}

The GET method works.

Docker deployment

After I finished deploying with Docker, I can't reach the service. What could be the problem?

Is there a GUI to use this in windows?

Is there a GUI to use this in windows?

I am not an advanced technical user. I just know that I pay for DeepL Pro every month and I don't want to pay for it anymore. Is there an easy way to use this on Windows?

How the plugin works

Does running this plugin still require installing the server with Docker? Or am I doing something wrong and it has in fact already deployed the server locally?

429 error even though it was my first time; I can use the site's translation

It was my first time, and I just tried this:

[tbb@hai ~]$ curl -d '{"text":"hello", "source_lang":"EN", "target_lang":"DE"}' -H "Content-Type: application/json" -X POST 0.0.0.0:1188/translate
{"code":429,"message":"Too Many Requests"}
[tbb@hai ~]$ curl -d '{"text":"hello", "source_lang":"EN", "target_lang":"DE"}' -H "Content-Type: application/json" -X POST 0.0.0.0:1188/translate

Return source_lang in the result

The server detects the source language if there is no source_lang param:

DeepLX/main.go

Line 155 in eb6afa8

lang := whatlanggo.DetectLang(translateText)

But it is not possible to tell from the result which language the server detected. It would be nice to have the detected source language returned in the result.

Error when running a command

Problem:
When I run "launchctl load /Library/LaunchAgents/me.missuo.deeplx.plist",
the terminal outputs:
Load failed: 5: Input/output error
Try running launchctl bootstrap as root for richer errors.
Platform: M2 Mac

error code 429 Too Many Requests

I wrote a program to translate SRT subtitle files, translating the input file line by line, but unfortunately I reached DeepL's free usage limit before even finishing my first subtitle :P

I know this is not related to DeepLX, but the description misled me by saying: "deeplx is unlimited to the number of requests." Perhaps deeplx itself is unlimited, but since it uses DeepL as its backend, it does in fact have a request limit.

I searched the DeepL website and didn't find the request limit documented anywhere, but it clearly has one.

[ mac m1] brew services start owo-network/brew/deeplx failed

(base) ➜ ~ brew services start owo-network/brew/deeplx

Error: uninitialized constant Homebrew::Service::System
/opt/homebrew/Library/Taps/homebrew/homebrew-services/cmd/services.rb:61:in `services'
/opt/homebrew/Library/Homebrew/brew.rb:94:in '

Too Many Requests

I have implemented a call to your DeepLX in the ComicRack translation plugin and it works very well, but after translating for a while it blocked me. In fact, it doesn't even let me translate from the web anymore. I am attaching the log that the Calibre plugin gives me.

======================== BEGIN LOG =============================
.
. A lot of translations before......
.

Original: It took the Torch soldiers no more than four or five seconds to get the slaver rolled over and haul Takahashi off him, but by then she’d pretty well transformed a third of his frontal lobes into hash. The autopsy ’bot later reported that she’d carved up part of the limbic system as well.
Traducción: Los soldados de la Antorcha no tardaron más de cuatro o cinco segundos en hacer rodar al esclavo y quitarse a Takahashi de encima, pero para entonces ya había transformado en hachís un tercio de sus lóbulos frontales. El robot de la autopsia informó más tarde de que también había cortado parte del sistema límbico.

Original: Modern medicine is not actually miraculous, although the term is often used. For all practical purposes, the man was gone before any aid could be given him.
No se pudieron recuperar los datos de la API del motor de traducción.
Se volverá a intentar en 5 segundos.

Reintentando... (el tiempo de espera es de 300 segundos).
No se pudieron recuperar los datos de la API del motor de traducción.
Se volverá a intentar en 10 segundos.

Reintentando... (el tiempo de espera es de 300 segundos).

No se pudieron recuperar los datos de la API del motor de traducción.
Se volverá a intentar en 30 segundos.

Reintentando... (el tiempo de espera es de 300 segundos).
Traceback (most recent call last):
File "calibre_plugins.ebook_translator.engines.base", line 81, in get_result
File "mechanize_mechanize.py", line 257, in open
File "mechanize_mechanize.py", line 313, in _mech_open
mechanize._response.get_seek_wrapper_class..httperror_seek_wrapper: HTTP Error 429: Too Many Requests

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "calibre_plugins.ebook_translator.translation", line 59, in _translate
File "calibre_plugins.ebook_translator.engines.custom", line 120, in translate
File "calibre_plugins.ebook_translator.engines.base", line 88, in get_result
Exception: No se pudo analizar la respuesta devuelta. Datos sin procesar: HTTP Error 429: Too Many Requests

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "calibre_plugins.ebook_translator.engines.base", line 81, in get_result
File "mechanize_mechanize.py", line 257, in open
File "mechanize_mechanize.py", line 313, in _mech_open
mechanize._response.get_seek_wrapper_class..httperror_seek_wrapper: HTTP Error 429: Too Many Requests

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "calibre_plugins.ebook_translator.translation", line 59, in _translate
File "calibre_plugins.ebook_translator.engines.custom", line 120, in translate
File "calibre_plugins.ebook_translator.engines.base", line 88, in get_result
Exception: No se pudo analizar la respuesta devuelta. Datos sin procesar: HTTP Error 429: Too Many Requests

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "calibre_plugins.ebook_translator.engines.base", line 81, in get_result
File "mechanize_mechanize.py", line 257, in open
File "mechanize_mechanize.py", line 313, in _mech_open
mechanize._response.get_seek_wrapper_class..httperror_seek_wrapper: HTTP Error 429: Too Many Requests

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "calibre_plugins.ebook_translator.translation", line 59, in _translate
File "calibre_plugins.ebook_translator.engines.custom", line 120, in translate
File "calibre_plugins.ebook_translator.engines.base", line 88, in get_result
Exception: No se pudo analizar la respuesta devuelta. Datos sin procesar: HTTP Error 429: Too Many Requests

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "calibre_plugins.ebook_translator.engines.base", line 81, in get_result
File "mechanize_mechanize.py", line 257, in open
File "mechanize_mechanize.py", line 313, in _mech_open
mechanize._response.get_seek_wrapper_class..httperror_seek_wrapper: HTTP Error 429: Too Many Requests

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "calibre_plugins.ebook_translator.translation", line 59, in _translate
File "calibre_plugins.ebook_translator.engines.custom", line 120, in translate
File "calibre_plugins.ebook_translator.engines.base", line 88, in get_result
Exception: No se pudo analizar la respuesta devuelta. Datos sin procesar: HTTP Error 429: Too Many Requests

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "runpy.py", line 196, in _run_module_as_main
File "runpy.py", line 86, in _run_code
File "site.py", line 83, in
File "site.py", line 78, in main
File "site.py", line 50, in run_entry_point
File "calibre\utils\ipc\worker.py", line 215, in main
File "calibre\utils\ipc\worker.py", line 150, in arbitrary_n
File "calibre_plugins.ebook_translator.convertion", line 124, in convert_book
File "calibre\ebooks\conversion\plumber.py", line 1281, in run
File "calibre_plugins.ebook_translator.convertion", line 120, in convert
File "calibre_plugins.ebook_translator.translation", line 109, in handle
File "calibre_plugins.ebook_translator.translation", line 87, in _handle
File "calibre_plugins.ebook_translator.translation", line 71, in _translate
File "calibre_plugins.ebook_translator.translation", line 71, in _translate
File "calibre_plugins.ebook_translator.translation", line 71, in _translate
File "calibre_plugins.ebook_translator.translation", line 63, in _translate
Exception: No se pudieron recuperar los datos de la API del motor de traducción. No se pudo analizar la respuesta devuelta. Datos sin procesar: HTTP Error 429: Too Many Requests

======================== END LOG =============================

Formal and informal

I would like to be able to choose between formal and informal tone in the languages that support those tones. Right now it mixes the styles in the translation.

I know this is a feature of DeepL Pro, but I would like to know if it can be done with DeepLX somehow.

Thank you.

curl test

Cool !

Maybe add a curl test to the README, or a Makefile:


WEB_URL=http://0.0.0.0:1199

@echo ""
@echo "GET"
curl -H "Content-Type: application/json" -X GET $(WEB_URL)/

@echo ""
@echo "POST"
curl -d '{"text":"hello", "source_lang":"EN", "target_lang":"DE"}' -H "Content-Type: application/json" -X POST $(WEB_URL)/translate


Makefile

You can put this into the repo if you want.

Everything works; your description in the README makes it easy to get this running.

macOS only for now; it can be made to work on Windows and Linux with a bit more effort.



BIN_NAME=deeplx
BIN_FSPATH_NAME=.bin
BIN_FSPATH=$(BIN_FSPATH_NAME)
BIN=$(BIN_FSPATH)/$(BIN_NAME)

build:
	go build -o $(BIN) .
	# .bin
build-cross:
	chmod +x ./.cross_compile.sh
	 ./.cross_compile.sh
	# dist

WEB_URL=http://0.0.0.0:1199
build-run: build
	$(BIN) -h
	# http://localhost:1199/

run-test:
	@echo ""
	@echo "GET"
	curl -H "Content-Type: application/json" -X GET $(WEB_URL)/

	@echo ""
	@echo "POST"
	curl -d '{"text":"hello", "source_lang":"EN", "target_lang":"DE"}' -H "Content-Type: application/json" -X POST $(WEB_URL)/translate
	# {"code":200,"data":"hallo","id":157799001}%  

## install to bin

BIN_MAC_INSTALL_FSPATH=/usr/local/bin/$(BIN_NAME)
BIN_MAC_INSTALL_WHICH=$(shell which $(BIN_NAME))

bin-print:
	@echo ""
	@echo "BIN_MAC_INSTALL_FSPATH:        $(BIN_MAC_INSTALL_FSPATH)"
	@echo "BIN_MAC_INSTALL_WHICH          $(BIN_MAC_INSTALL_WHICH)"

bin-install:
	#sudo mv deeplx_darwin_amd64 /usr/local/bin/deeplx
	sudo cp $(BIN) $(BIN_MAC_INSTALL_FSPATH)
bin-install-del:
	rm -f $(BIN_MAC_INSTALL_WHICH)
bin-start:
	$(BIN_MAC_INSTALL_WHICH)
bin-test:
	$(MAKE) run-test
	


### install as service

SERVICE_MAC_PLIST_NAME=me.missuo.deeplx.plist
SERVICE_MAC_LAUNCH_AGENTS_FSPATH=$(HOME)/Library/LaunchAgents
SERVICE_MAC=$(SERVICE_MAC_LAUNCH_AGENTS_FSPATH)/$(SERVICE_MAC_PLIST_NAME)

service-print:
	@echo ""
	@echo "SERVICE_MAC_LAUNCH_AGENTS_FSPATH:    $(SERVICE_MAC_LAUNCH_AGENTS_FSPATH)"
	@echo "SERVICE_MAC_PLIST_NAME:              $(SERVICE_MAC_PLIST_NAME)"
	@echo "SERVICE_MAC:                         $(SERVICE_MAC)"
	@echo ""
service-print-list:
	# check ours is there
	ls -al $(SERVICE_MAC_LAUNCH_AGENTS_FSPATH)
service-install:
	cp $(REPO_NAME)/$(SERVICE_MAC_PLIST_NAME) $(SERVICE_MAC)
	launchctl load $(SERVICE_MAC)
service-install-del:
	launchctl unload $(SERVICE_MAC)
	rm -f  $(SERVICE_MAC)
service-test:
	$(MAKE) run-test
service-start:
	@echo "service start"
	launchctl start $(SERVICE_MAC)
service-stop:
	@echo "service stop"
	launchctl stop $(SERVICE_MAC)

Add CORS

CORS headers could be added to allow browser extensions or SPAs to access the service directly. http://localhost:<port> is a potentially trustworthy origin, so it can be accessed directly from an https origin.

Gin CORS middleware:

package main

import (
  "github.com/gin-contrib/cors"
  "github.com/gin-gonic/gin"
)

func main() {
  router := gin.Default()
  // same as:
  // config := cors.DefaultConfig()
  // config.AllowAllOrigins = true
  // router.Use(cors.New(config))
  router.Use(cors.Default())
  router.Run()
}

Windows version

Could you compile your project to a Windows 32/64 binary? I could install it as a service using NSSM.

I think this is very interesting because I am using a Calibre plugin to translate books and there is an issue about this topic:

bookfere/Ebook-Translator-Calibre-Plugin#6

The problem is that without a Windows binary, Windows users will be left orphaned.

Known Issues

DeepL may have updated the API logic, which may currently make the service unusable.

Wait for the next version of DeepLX to be released.

flag provided but not defined: -logging-level

From version v0.7.8 up to the latest v0.8.0, running deeplx with --logging-level info produces the error: flag provided but not defined: -logging-level.

I had thought that was the cause of the brew issue.
