ollama-d 0.3.2

D bindings for the Ollama API


To use this package, run the following command in your project's root directory:

    dub add ollama-d

Manual usage
Put the following dependency into your project's dependencies section:

    "ollama-d": "~>0.3.2"

ollama-d


D language bindings for the Ollama REST API - Seamless integration with local AI models

Features

  • Text generation with native and OpenAI-compatible endpoints
  • Chat interactions with local AI models
  • Model management (list, create, show, pull, push, copy, delete models)
  • Configurable timeout settings
  • Simple and intuitive API design using std.net.curl and std.json
  • Server version retrieval
  • OpenAI-compatible API endpoints

Prerequisites

  • D compiler installed on your system
  • Ollama server running locally (default: "http://127.0.0.1:11434")
  • Installed AI model (e.g., "llama3.2")
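Assuming the Ollama CLI is already installed, the server and model prerequisites above can be satisfied from a terminal (model name "llama3.2" matches the examples below):

```shell
# Start the Ollama server (listens on 127.0.0.1:11434 by default)
ollama serve &

# Download the model used in the examples
ollama pull llama3.2
```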

Quick Examples

import ollama;
import std.stdio;

void main() {
    // Initialize Ollama client on localhost at port 11434
    auto client = new OllamaClient();

    // Text generation
    auto generateResponse = client.generate("llama3.2", "Why is the sky blue?");
    writeln("Generate Response: ", generateResponse["response"].str);

    // Chat interaction
    Message[] messages = [Message("user", "Hello, how are you?")];
    auto chatResponse = client.chat("llama3.2", messages);
    writeln("Chat Response: ", chatResponse["message"]["content"].str);

    // List available models
    auto models = client.listModels();
    writeln("Available Models: ", models);

    // OpenAI-compatible chat completions
    auto openaiResponse = client.chatCompletions("llama3.2", messages, 50, 0.7);
    writeln("OpenAI-style Response: ", openaiResponse["choices"][0]["message"]["content"].str);

    // Get server version (`version` is a reserved keyword in D, so use another name)
    auto serverVersion = client.getVersion();
    writeln("Ollama Server Version: ", serverVersion);
}

Additional Methods

  • generate(): Text generation with custom options
  • chat(): Conversational interactions
  • listModels(): Retrieve available models
  • showModel(): Get detailed model information
  • createModel(): Create custom models
  • copy(): Copy existing models
  • deleteModel(): Remove models from server
  • pull(): Download models from registry
  • push(): Upload models to registry
  • chatCompletions(): OpenAI-compatible chat endpoint
  • completions(): OpenAI-compatible text completion
  • getModels(): List models in OpenAI-compatible format
  • setTimeOut(): Configure request timeout duration
  • getVersion(): Retrieve Ollama server version
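The model-management methods above can be combined into a typical workflow. This is a hedged sketch, not the package's documented usage: the exact parameter types are assumptions (model names as strings, the timeout as a core.time Duration), so consult the generated API documentation for authoritative signatures.

```d
import ollama;
import std.stdio;
import core.time : seconds;

void main() {
    auto client = new OllamaClient();

    // Allow extra time for large model downloads
    // (Duration argument is an assumption)
    client.setTimeOut(120.seconds);

    // Download a model from the registry, then inspect it
    client.pull("llama3.2");
    auto info = client.showModel("llama3.2");
    writeln("Model details: ", info);

    // Remove a model that is no longer needed
    client.deleteModel("llama3.2");
}
```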

License

MIT License

Authors:
  • Matheus Catarino França
Sub packages:
ollama-d:simple, ollama-d:coder
Dependencies:
none
Versions:
0.3.2 2025-Mar-22
0.3.1 2025-Mar-21
0.3.0 2025-Mar-21
0.2.0 2025-Mar-20
0.1.0 2025-Mar-19
Short URL:
ollama-d.dub.pm