# Custom Endpoints

MyLinks supports OpenAI API-compatible services using the `librechat.yaml` configuration file.

This guide assumes you have already set up MyLinks using Docker, as shown in the Local Setup Guide.
## Step 1. Create or Edit a Docker Override File
- Create a file named `docker-compose.override.yml` at the project root (if it doesn't already exist).
- Add the following content to the file:

```yaml
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml
```

Learn more about the Docker Compose Override File here.
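As an optional sanity check, you can ask Docker Compose to render the merged configuration and confirm the bind mount from the override file is present (this uses the standard `docker compose config` command):

```bash
# Render the merged Compose configuration and check that the
# librechat.yaml bind mount appears under the api service
docker compose config | grep -A 3 "librechat.yaml"
```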
## Step 2. Configure `librechat.yaml`
- Create a file named `librechat.yaml` at the project root (if it doesn't already exist).
- Add your custom endpoints: you can view compatible endpoints in the AI Endpoints section.
  - The list is not exhaustive; generally, every OpenAI API-compatible service should work.
  - There are many options for Custom Endpoints. View them all here: Custom Endpoint Object Structure.
- As an example, here is a configuration for both OpenRouter and Ollama:
```yaml
version: 1.1.4
cache: true
endpoints:
  custom:
    - name: "OpenRouter"
      apiKey: "${OPENROUTER_KEY}"
      baseURL: "https://openrouter.ai/api/v1"
      models:
        default: ["gpt-3.5-turbo"]
        fetch: true
      titleConvo: true
      titleModel: "current_model"
      summarize: false
      summaryModel: "current_model"
      forcePrompt: false
      modelDisplayLabel: "OpenRouter"
    - name: "Ollama"
      apiKey: "ollama"
      baseURL: "http://host.docker.internal:11434/v1/"
      models:
        default: [
          "llama3:latest",
          "command-r",
          "mixtral",
          "phi3"
        ]
        fetch: false # fetching list of models is not supported
      titleConvo: true
      titleModel: "current_model"
```
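To verify that an OpenAI-compatible endpoint is reachable before wiring it into MyLinks, you can query its model list directly. Here is a minimal check against the OpenRouter `baseURL` from the example above, assuming `OPENROUTER_KEY` is exported in your shell:

```bash
# List models from the OpenAI-compatible endpoint; a JSON payload
# in response confirms the baseURL and API key are valid
curl -s https://openrouter.ai/api/v1/models \
  -H "Authorization: Bearer $OPENROUTER_KEY"
```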
## Step 3. Configure `.env` File
- Edit your existing `.env` file at the project root.
  - Copy `.env.example` and rename it to `.env` if it doesn't already exist.
- According to the config above, the environment variable `OPENROUTER_KEY` is expected and should be set:

```
OPENROUTER_KEY=your_openrouter_api_key
```
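Once the stack is running (Step 4), one way to confirm the variable actually reached the container is to print it from inside the `api` service:

```bash
# Print the variable from inside the running api container;
# an empty result means the .env file was not picked up
docker compose exec api printenv OPENROUTER_KEY
```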
Notes:

- By way of example, this guide assumes you have set up Ollama independently and that it is accessible to you at `http://host.docker.internal:11434`.
  - `host.docker.internal` is a special DNS name that resolves to the internal IP address used by the host.
  - You may need to change this to the actual IP address of your Ollama instance.
- In a future guide, we will go into setting up Ollama along with MyLinks.
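Before starting MyLinks, you can confirm your Ollama instance responds on its OpenAI-compatible route (this assumes a recent Ollama release running on its default port, 11434):

```bash
# From the host, Ollama listens on localhost; the api container
# reaches the same service via host.docker.internal
curl http://localhost:11434/v1/models
```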
## Step 4. Run the App
Now that your files are configured, you can run the app:

```bash
docker compose up
```

Or, if you were running the app before, you can restart the app with:

```bash
docker compose restart
```

Note: Make sure your Docker Desktop or Docker Engine is running before executing the command.
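If your custom endpoints don't show up in the UI, the `api` service logs usually reveal whether `librechat.yaml` was found and parsed; following them is a quick way to check:

```bash
# Stream api service logs and watch for config errors on startup
docker compose logs -f api
```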
## Conclusion
That’s it! You have now configured Custom Endpoints for your MyLinks instance.
## Additional Links
Explore more about MyLinks and how to configure it to your needs.
- Updating MyLinks
  - Instructions on how to update this setup with the latest changes to MyLinks.
- Configuring AI Providers
  - Configure OpenAI, Google, Anthropic, and OpenAI Assistants.
- Configuring a Custom Endpoint
  - Configure services such as OpenRouter, Ollama, Mistral AI, Databricks, Groq, and others.
  - Click here for a list of known, compatible services.
- Environment Configuration
  - Read for a comprehensive look at the `.env` file.
- librechat.yaml File Configuration
  - Configure custom rate limiters, file outputs, and much more with the `librechat.yaml` file.
- Ubuntu Docker Deployment Guide
  - Read for advanced Docker setup on a remote/headless server.
- Setup the Azure OpenAI endpoint
  - Configure multiple Azure regions and deployments for seamless use with MyLinks.