Your own DeepL server via FastAPI, cross-platform (Windows/Linux/macOS), with an API for OmegaT
- Create a virtual environment (optional but recommended), e.g.,

```bash
# Linux and friends
python3.7 -m venv .venv
source .venv/bin/activate

# Windows
# py -3.7 -m venv .venv
# .venv\Scripts\activate
```
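If you are unsure whether the virtual environment is actually active, a quick stdlib-only sanity check works on all three platforms: inside a venv, `sys.prefix` is redirected away from the base interpreter's prefix.

```python
import sys

# True when the interpreter is running inside a virtual environment:
# venv redirects sys.prefix, while sys.base_prefix keeps pointing at
# the base installation
in_venv = sys.prefix != sys.base_prefix
print("virtualenv active:", in_venv)
```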
```bash
pip install deepl-fastapi
```

or (if you use poetry)

```bash
poetry add deepl-fastapi
```

or

```bash
pip install git+https://github.com/ffreemt/deepl-fastapi.git
```

or

- Clone the repo https://github.com/ffreemt/deepl-fastapi.git and install from it:

```bash
git clone https://github.com/ffreemt/deepl-fastapi.git
cd deepl-fastapi
pip install -r requirements.txt
```

or

```bash
poetry install
```
- Start the server

Use uvicorn directly (note the `deepl_server` module, not `run_uvicorn`):

```bash
uvicorn deepl_fastapi.deepl_server:app
```

or

```bash
deepl-fastapi
# this option is available only if installed via pip install or poetry add
```

or

```bash
python3.7 -m deepl_fastapi.run_uvicorn
```

or run the server on an external interface, for example at port 9888:

```bash
uvicorn deepl_fastapi.deepl_server:app --reload --host 0.0.0.0 --port 9888
```
- Explore and consume

Point your browser to http://127.0.0.1:8000/text/?q=test&to_lang=zh

Or in Python code (`pip install requests` first):

```python
import requests

# GET
url = "http://127.0.0.1:8000/text/?q=test me&to_lang=zh"
print(requests.get(url).json())
# {'q': 'test me', 'from_lang': None, 'to_lang': 'zh',
#  'trtext': '考我 试探我 测试我 试探'}

# POST
text = "test this and that"
data = {"text": text, "to_lang": "zh"}
resp = requests.post("http://127.0.0.1:8000/text", json=data)
print(resp.json())
# {'q': {'text': 'test this and that', 'from_lang': None, 'to_lang': 'zh', 'description': None},
#  'result': '试探 左右逢源 检验 审时度势'}
```
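Note that the GET URL above contains a raw space in `test me`; `requests` escapes it for you, but if you assemble URLs by hand (for curl or the browser), percent-encode the query string. A minimal stdlib sketch, assuming the default base URL from above:

```python
from urllib.parse import urlencode

def build_text_url(q: str, to_lang: str,
                   base: str = "http://127.0.0.1:8000/text/") -> str:
    """Build a properly escaped GET url for the /text endpoint."""
    return base + "?" + urlencode({"q": q, "to_lang": to_lang})

print(build_text_url("test me", "zh"))
# http://127.0.0.1:8000/text/?q=test+me&to_lang=zh
```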
- Copy `omegat-plugin-fake-mt-1.0.0.jar` (available at https://github.com/briacp/omegat-plugin-fake-mt) to `OmegaT\plugins` (e.g., `C:\Program Files\OmegaT\plugins`)

- Run OmegaT and set up omegat-plugin-fake-mt under OmegaT/Preferences/Machine Translation/Fake MT/Configure:

```
Name: Fake Deepl MT
URL: http://localhost:8000/text
Source Parameter: from_lang
Target Parameter: to_lang
Text Parameter: q
```
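With those parameter names, the plugin presumably issues GET requests of the following shape against the server; a stdlib sketch of the resulting URL (the language codes are placeholders for illustration):

```python
from urllib.parse import urlencode

# the query string the Fake MT plugin would send, given the parameter
# names configured above (from_lang / to_lang / q)
params = {"from_lang": "en", "to_lang": "zh", "q": "Hello world"}
url = "http://localhost:8000/text?" + urlencode(params)
print(url)
# http://localhost:8000/text?from_lang=en&to_lang=zh&q=Hello+world
```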