Tether launches BitNet LoRA framework to support large-model training on mobile devices.

PANews reported on March 17 that Tether's QVAC Fabric has launched the world's first cross-platform LoRA fine-tuning framework for Microsoft's BitNet (a 1-bit LLM architecture), significantly reducing the memory and compute requirements for training large models. The framework supports LoRA fine-tuning and accelerated inference on Intel, AMD, and Apple Silicon M-series hardware, as well as mobile GPUs such as Qualcomm Adreno, Arm Mali, and the GPUs in Apple's Bionic chips.
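The memory savings come from the core LoRA idea: instead of updating a full weight matrix during fine-tuning, only two small low-rank factors are trained while the pretrained weights stay frozen. The sketch below illustrates this in plain NumPy; it is a general illustration of the technique with hypothetical layer sizes, not code from Tether's QVAC Fabric or Microsoft's BitNet.

```python
import numpy as np

# Hypothetical layer dimensions and LoRA rank (not from the framework itself).
d, k, r = 4096, 4096, 8  # r << min(d, k) is what makes LoRA cheap

rng = np.random.default_rng(0)
W = rng.standard_normal((d, k))         # frozen pretrained weight (not trained)
A = rng.standard_normal((d, r)) * 0.01  # trainable low-rank factor
B = np.zeros((r, k))                    # trainable low-rank factor, zero-init
                                        # so the model starts unchanged

def lora_forward(x):
    # Effective weight is W + A @ B, but we never materialize the
    # d x k product: we apply the two small factors in sequence.
    return x @ W + (x @ A) @ B

full_params = d * k          # parameters a full fine-tune would update
lora_params = r * (d + k)    # parameters LoRA actually trains
print(f"trainable: {lora_params} vs {full_params} "
      f"({lora_params / full_params:.2%} of full fine-tuning)")
```

With these example sizes, LoRA trains well under 1% of the parameters a full fine-tune would touch, which is why such tuning becomes feasible on laptops and mobile GPUs.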


Author: PA一线

This content is for market information only and is not investment advice.
