PHP Client Library to interact with Ollama API.
The intention is to work with models on your Ollama setup, not to create or delete them. Therefore, this library will not implement any APIs to create, move or delete models. Here is a list of the APIs we intend to implement and the status of the implementation:
- Completion (without streaming support)
- Chat Completion (without streaming support)
- List local models
- Show Model Information
- List Running Models
- Version
Check out the Ollama API Docs for more information and for any APIs we might have missed.
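For reference, these are the raw Ollama HTTP endpoints behind the features listed above (assuming the default base URL); the library wraps these so you don't have to call them directly:

```shell
# Raw Ollama endpoints, assuming a server on the default port 11434.
curl http://localhost:11434/api/tags     # list local models
curl http://localhost:11434/api/ps       # list running models
curl http://localhost:11434/api/version  # server version
```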
This package contains low-level API libraries as well as a convenience wrapper covering all of them.
composer require tredmann/php-ollama
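Before making requests, you can verify that your local Ollama server is reachable. A quick check (assuming the default port) is the version endpoint, which this library also wraps:

```shell
# Ping a locally running Ollama server; assumes the default base URL.
curl http://localhost:11434/api/version
```

If the server is running, this returns a small JSON object containing the version string.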
The easiest way to ask the LLM things is to use the convenience wrapper:
use Ollama\Ollama;
$ollama = new Ollama(model: 'gemma2:latest');
echo $ollama->completion(prompt: 'What is the capital of Germany?');
// The capital of Germany is **Berlin**.
It does have a lot of limitations, but for quick results it is easy to use. For anything beyond that, I would highly encourage you to look into the low-level library.
use Ollama\Client\OllamaClient;
$client = new OllamaClient(
baseUrl: 'http://localhost:11434' // default
);
use Ollama\Api\Completion;
$completionApi = new Completion(client: $client);
use Ollama\Requests\CompletionRequest;
$request = new CompletionRequest(
model: 'phi3.5:latest',
prompt: 'What is the capital of Germany?'
);
$response = $completionApi->getCompletion(request: $request);
echo $response->response;
// 'The capital of Germany is Berlin.'
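The chat completion API should follow the same client/API/request pattern. The class and method names below (`ChatCompletion`, `ChatCompletionRequest`, `getChatCompletion`, and the message array shape) are assumptions mirroring the completion flow above, not confirmed names — check the package source for the exact API:

```php
<?php
// Hypothetical sketch of the chat flow, mirroring the completion example.
// Class, method, and property names are assumed, not confirmed.
use Ollama\Client\OllamaClient;
use Ollama\Api\ChatCompletion;
use Ollama\Requests\ChatCompletionRequest;

$client = new OllamaClient(baseUrl: 'http://localhost:11434');
$chatApi = new ChatCompletion(client: $client);

$request = new ChatCompletionRequest(
    model: 'phi3.5:latest',
    messages: [
        ['role' => 'user', 'content' => 'What is the capital of Germany?'],
    ]
);

$response = $chatApi->getChatCompletion(request: $request);
echo $response->message['content'];
```

This requires a running Ollama server with the model pulled, so it is meant as a shape to adapt rather than copy verbatim.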