Translator++ Ver. 5.11.30 – Welcome to the era of AI

Good day!
Hey there, fellow translator!

How are you? I hope you enjoy your upcoming holiday (while I won't be getting one).

Cloud Computing Platforms AI

As you may or may not know, AI technology keeps on growing, and so does Translator++. Translator++ is now focusing on AI translation and supports the LiteLLM protocol, which means it can work with 100+ LLM providers, such as OpenAI, Azure, AWS, Cohere, Anthropic, Huggingface, replicate, together_ai, openrouter, google vertex_ai & palm, ai21, baseten, vllm, nlp_cloud, aleph alpha, petals, ollama, deepinfra, perplexity-ai, and anyscale.

You need to install this plugin to perform CCP (Cloud Computing Platforms) AI translation:
Using Cloud Computing Platforms AI on Hugging Face
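
If you are curious what "supports the LiteLLM protocol" means in practice, here is a minimal Python sketch of LiteLLM's unified interface. This is only an illustration under my own assumptions (the model names, the prompt, and the translate() helper are examples), not the actual transLitellm plugin code:

    # Minimal sketch of LiteLLM's unified completion() interface
    # (illustration only, not the transLitellm plugin itself).
    import os
    from litellm import completion

    os.environ["OPENAI_API_KEY"] = "sk-..."  # each provider reads its own API key env var

    def translate(text, model="gpt-3.5-turbo"):
        # The same call works for any supported provider, e.g.
        # "gpt-3.5-turbo", "command-nightly", or "ollama/llama2".
        response = completion(
            model=model,
            messages=[{"role": "user",
                       "content": f"Translate to English: {text}"}],
        )
        # LiteLLM normalizes every provider to the OpenAI response shape.
        return response.choices[0].message.content

    print(translate("こんにちは、世界"))

Swapping providers is mostly a matter of changing the model string (plus the matching API key), which is why a single addon can cover so many services.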

Offline-AI

If you are interested in torturing your GPU/CPU, or don’t care about the electricity bill, Translator++ now supports offline AI computing via GPT4All. GPT4All is an ecosystem for running powerful, customized large language models locally on consumer-grade CPUs and any GPU. GPT4All offers a wide range of pre-trained models that are on par with commercial ones, and it’s all free.

Installed GPT4All plugin
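
As a rough illustration of what local generation with GPT4All looks like, here is a short sketch using GPT4All's Python bindings; this is not the addon's own code, and the model filename is just an example (it is downloaded on first use):

    # Sketch of offline text generation with GPT4All's Python bindings
    # (illustration only; the addon talks to GPT4All, not this exact code).
    from gpt4all import GPT4All

    # Example model file; fetched automatically on the first run.
    model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")

    with model.chat_session():
        reply = model.generate(
            "Translate to English: こんにちは、世界",
            max_tokens=100,
        )
        print(reply)  # runs entirely on your own CPU/GPU: no API key, no quota

Everything stays on your machine, which is where the GPU/CPU torture and the electricity bill come in.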

Plus, it also supports several offline server protocols. In simple terms, Translator++ now supports almost every type of text generation back-end, whether online or on-premises.

In other words, from now on Translator++ supports (probably) any AI service you know.

Changelog

[5.11.30]

  • Add : New Addon: GPT4All – Free, unlimited, locally hosted automatic translation using the GPT4All API
  • Add : New Addon: transLitellm – Automatic translation with 100+ LLM service providers out there, including OpenAI, Azure, AWS, Cohere, Anthropic, Huggingface, replicate, together_ai, openrouter, google vertex_ai & palm, ai21, baseten, vllm, nlp_cloud, aleph alpha, petals, ollama, deepinfra, perplexity-ai, anyscale, and more…
  • Add : WMIC library.
  • Add : Python library configuration.
  • Add : Python library: Installation through requirements.txt is cached for better performance

[5.11.22]

  • Update : transEz ver. 0.2.1
  • Add : New Addon: KoboldAI – Free, unlimited, locally hosted automatic translation using KoboldAI/KoboldCPP
  • Update : ResourceParser ver. 0.1.3
  • Fix : ResourceParser – Unable to inject translation
  • Fix : Inject translation window’s Source material field is not updated on change
  • Fix : Added default value for source material location and project’s name

[5.11.15]

  • Fix : “Translate here using…” menu now scrollable
  • Fix : Fixed a bug that caused several keywords to be translated into function String() { [native code] }. This is known to have caused errors in RPGMaker games.
  • Update : EnigmaVBUnpacker ver. 0.61
  • Update : VNTextPatch ver. 0.7 – Added option to use the old xlsx interface to parse the file
  • Fix : Addon’s onBeforeInstall and onAfterInstall events are now executed with await syntax under the hood.
  • Fix : Addon installation bug fixes
  • Add : transEz addon – Japanese <-> Korean translator using ezTransXP (no Python installation needed)

[5.11.9]

  • Add : Windows resource translator ver. 0.1 – Supports .exe, .dll, .res, .rc
  • Add : common.makeTempDir() to generate a temporary directory.

[5.11.1]

  • Fix : Form editor : Submitted value is blank
  • Update : WolfJS ver. 1.3
  • Fix : WolfJS : Unable to apply translation to some WolfRPG ver. 3 games.
  • Add : WolfJS : Added “Maximum number of line in a message box” to the inject dialog window
  • Add : WolfJS : Added “Maximum number of line in a message box” to the export dialog window