

Deploying Models with Xinference

Author(s): zhaozhiming

Originally published on Towards AI.

Today, let’s explore Xinference, a deployment and inference tool for Large Language Models (LLMs). It stands out for quick deployment, ease of use, efficient inference, support for a wide range of open-source models, and both a WebGUI and API endpoints for convenient model deployment and inference. Let’s dive in!

Xorbits Inference (Xinference) is a powerful and comprehensive distributed inference framework suitable for various models. With Xinference, you can effortlessly deploy your own or cutting-edge open-source models with just one click. Whether you’re a researcher, developer, or data scientist, Xinference connects you with the latest AI models, unlocking more possibilities. Below is a comparison of Xinference with other model deployment and inference tools:

[Comparison table of Xinference vs. other model deployment and inference tools]

Xinference supports two installation methods: Docker image and local installation. For those interested in the Docker method, please refer to the official Docker Installation Documentation. Here, we will focus on local installation.

First, install Xinference’s Python dependencies:

pip install "xinference[all]"

Since Xinference depends on many third-party libraries, the installation might take some time. Once it completes, you can start the Xinference service with the following command:

xinference-local

Upon successful startup, access the Xinference WebGUI via http://localhost:9997.
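Besides the WebGUI, the running service exposes an OpenAI-compatible REST API. As a minimal sketch (the model UID `my-model-uid` and the default port 9997 are assumptions; a model must first be launched via the WebGUI or client):

```python
import requests

XINFERENCE_URL = "http://localhost:9997"  # assumed default port

def build_chat_payload(model_uid: str, user_message: str, max_tokens: int = 256) -> dict:
    """Build the JSON body for the OpenAI-compatible chat completions endpoint."""
    return {
        "model": model_uid,
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": max_tokens,
    }

# With a model launched (its UID is shown in the WebGUI), a request looks like:
# resp = requests.post(f"{XINFERENCE_URL}/v1/chat/completions",
#                      json=build_chat_payload("my-model-uid", "Hello, Xinference!"))
# print(resp.json()["choices"][0]["message"]["content"])
```

The helper only assembles the request body, so you can swap in any launched model's UID without touching the rest of the call.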

Note: During the installation of Xinference, it might install a different version of PyTorch (due to its dependency on…


Published via Towards AI
