Authentication

AI Foundation Services uses API keys for authentication. All requests must include your API key in the Authorization header.

Get started immediately with a free trial key:

  1. Visit the API Key Portal
  2. Create an account and generate your API key
  3. Your trial key gives you access to all available models

For production workloads, purchase an API key via the T-Cloud Marketplace.

Store your API key as an environment variable — never hardcode it in your source code.

export OPENAI_API_KEY="your_api_key_here"
export OPENAI_BASE_URL="https://llm-server.llmhub.t-systems.net/v2"

To persist across sessions, add these lines to your ~/.zshrc or ~/.bashrc file.

When OPENAI_API_KEY and OPENAI_BASE_URL are set, the OpenAI SDKs pick them up automatically:

from openai import OpenAI
client = OpenAI() # No need to pass api_key or base_url

You can also pass them explicitly:

from openai import OpenAI
client = OpenAI(
    api_key="your_api_key_here",
    base_url="https://llm-server.llmhub.t-systems.net/v2",
)

When calling the API directly, include the API key as a Bearer token in the Authorization header:

curl -X POST "https://llm-server.llmhub.t-systems.net/v2/chat/completions" \
-H "Authorization: Bearer $OPENAI_API_KEY" \
-H "Content-Type: application/json" \
-d '{"model": "Llama-3.3-70B-Instruct", "messages": [{"role": "user", "content": "Hello"}]}'
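The same request can also be built from Python without the SDK. The following is a minimal standard-library sketch of the curl call above; the final `urlopen` line is commented out so the snippet constructs the request without sending it:

```python
import json
import os
import urllib.request

# Build the same chat completion request the curl example sends
base_url = os.environ.get("OPENAI_BASE_URL", "https://llm-server.llmhub.t-systems.net/v2")
payload = {
    "model": "Llama-3.3-70B-Instruct",
    "messages": [{"role": "user", "content": "Hello"}],
}
req = urllib.request.Request(
    f"{base_url}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# response = urllib.request.urlopen(req)  # uncomment to actually send the request
```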

All API requests go to:

https://llm-server.llmhub.t-systems.net/v2

This is the OpenAI-compatible endpoint. For the full API specification, see the API Reference.

  • Never commit API keys to version control. Use .env files and add them to .gitignore.
  • Use environment variables in production rather than hardcoding keys.
  • Rotate keys regularly via the API Key Portal.
  • Monitor usage through the API Key Portal to track token consumption and costs.