Proton’s Lumo AI chatbot: not end-to-end encrypted, not open source
(pivot-to-ai.com)
OK, so I just checked the page:
https://lumo.proton.me/guest
Looks like a generic Open Web UI instance, much like Qwen's: https://openwebui.com/
Based on this support page, they are using open models and possibly finetuning them:
https://proton.me/support/lumo-privacy
But this information is hard to find, and they aren't particularly smart models, even for 32B-class ones.
Still... the author is incorrect on one point: that privacy page does specify how long requests are kept:
But it also mentions that, as is unavoidable with current inference stacks, prompts are decrypted on the GPU servers for processing. Theoretically you could hack the tokenizer and the input/output embedding layers into a pseudo-E2E encryption scheme, but I haven't heard of anyone doing this yet. It would also probably be incompatible with their serving framework (likely vLLM) without some crack CUDA and Rust engineering, since you'd need to scramble the text and tokenize/detokenize it uniquely, against scrambled outer layers of the LLM, for each request.
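To make the "scramble the outer layers" idea concrete, here's a toy NumPy sketch (all names and shapes are mine, purely illustrative, nothing Proton actually does): the client applies a secret per-request permutation to token IDs, and the server's embedding table is reordered to match. The model then computes exactly the same vectors internally while the server never sees real token IDs. Output logits come back indexed by the scrambled vocabulary and the client un-scrambles them before detokenizing.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB, DIM = 1000, 16

# Toy input-embedding table; a real model's output head would need the
# same row reordering so output logits are scrambled consistently.
W_embed = rng.normal(size=(VOCAB, DIM))

# Per-request secret permutation of token IDs, known only to the client.
perm = rng.permutation(VOCAB)
inv = np.argsort(perm)  # inverse permutation

# Server gets an embedding table where row perm[t] holds the vector for
# real token t (i.e. W_scrambled[perm] == W_embed).
W_scrambled = W_embed[inv]

# Client side: tokenize normally, then scramble IDs before sending.
real_ids = np.array([5, 42, 7])
sent_ids = perm[real_ids]

# Server side: looking up the scrambled IDs yields exactly the right
# vectors, without the server ever learning the real token IDs.
assert np.allclose(W_scrambled[sent_ids], W_embed[real_ids])

# Output direction: logits arrive indexed by scrambled IDs; the client
# recovers the logit for real token t at scrambled index perm[t].
logits_scrambled = rng.normal(size=VOCAB)
logits_real = logits_scrambled[perm]
assert logits_real[5] == logits_scrambled[perm[5]]
```

This only hides token identities from a logging server, not sequence lengths or timing, and wiring it into a batched serving engine is exactly the hard part mentioned above.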
They are right about one thing: Proton all but advertises Lumo as E2E, and that is a lie. Like any Open Web UI-style frontend, it sends the full chat history for that particular conversation to the server with each request, where it is decrypted and tokenized. If the GPU server were hacked, that text could absolutely be logged and intercepted.
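For reference, this is roughly what such a frontend resends on every turn. It's a sketch of the common OpenAI-style chat-completions payload; I'm assuming Lumo's wire format looks similar, and the model name here is made up. The point is that the entire conversation so far sits in the request body, in plaintext once TLS is terminated at the server.

```python
import json

# Sketch of an OpenAI-style chat-completions request body. The frontend
# resends the FULL conversation history with every new turn, so the
# inference server sees all of it in the clear.
payload = {
    "model": "lumo-32b",  # hypothetical model name
    "messages": [
        {"role": "user", "content": "first question"},
        {"role": "assistant", "content": "first answer"},
        {"role": "user", "content": "follow-up question"},  # the new turn
    ],
    "stream": True,
}

body = json.dumps(payload)
# Every earlier message is right there in the request body.
assert "first question" in body and "first answer" in body
```

A compromised server doesn't need to break any crypto; it just has to log `body` before handing it to the tokenizer.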