Running on Linux #48

Draft · pd95 wants to merge 7 commits into main from feature/make-Combine-optional

Conversation

@pd95 (Contributor) commented Mar 5, 2025

This PR tries to make OllamaKit available on non-Apple platforms by removing the dependency on "Combine" where it is not available.
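
For illustration only, a minimal sketch of this conditional-compilation approach could look like the code below; HTTPClientSketch and requestPublisher are made-up names, not OllamaKit's actual API:

import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking   // URLSession and URLRequest live here on Linux
#endif
#if canImport(Combine)
import Combine
#endif

// Hypothetical client type, for illustration only.
struct HTTPClientSketch {
#if canImport(Combine)
    // Combine-based variant: compiled only on Apple platforms where Combine exists.
    func requestPublisher(_ request: URLRequest) -> AnyPublisher<Data, URLError> {
        URLSession.shared.dataTaskPublisher(for: request)
            .map(\.data)
            .eraseToAnyPublisher()
    }
#endif
}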

Further, as the URLSession.bytes() method has not (yet?) been implemented in https://github.com/swiftlang/swift-corelibs-foundation, I have reimplemented the OKHTTPClient.stream() async method.
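
As a rough sketch of that technique (not the actual OKHTTPClient code; ChunkStreamer is a made-up name, and cancellation handling and thread-safety are glossed over), URLSessionDataDelegate callbacks can be bridged into an AsyncThrowingStream:

import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking
#endif

// Illustrative only: forwards each received chunk of a response as it arrives.
final class ChunkStreamer: NSObject, URLSessionDataDelegate {
    private var continuation: AsyncThrowingStream<Data, Error>.Continuation?

    func stream(_ request: URLRequest) -> AsyncThrowingStream<Data, Error> {
        AsyncThrowingStream { continuation in
            self.continuation = continuation
            let session = URLSession(configuration: .default, delegate: self, delegateQueue: nil)
            session.dataTask(with: request).resume()
        }
    }

    // Called repeatedly while the response body streams in.
    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data) {
        continuation?.yield(data)
    }

    // Called once at the end (or on failure); terminates the async stream.
    func urlSession(_ session: URLSession, task: URLSessionTask, didCompleteWithError error: Error?) {
        if let error {
            continuation?.finish(throwing: error)
        } else {
            continuation?.finish()
        }
        session.finishTasksAndInvalidate()
    }
}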

@pd95 pd95 mentioned this pull request Mar 5, 2025
@pd95 pd95 marked this pull request as draft March 6, 2025 21:26
@pd95 pd95 force-pushed the feature/make-Combine-optional branch from 003b92d to c97a887 on March 10, 2025 22:16
@pd95 (Contributor, Author) commented Mar 10, 2025

The streaming API still does not work on Linux according to my tests.

@pd95 pd95 force-pushed the feature/make-Combine-optional branch from 4a37088 to 05c7d8c on March 14, 2025 10:37
@pd95 (Contributor, Author) commented Mar 14, 2025

@danpalmer: Today I learned about using the Swift Docker image to build for Linux on my Mac, and with the help of an AI assistant I was able to identify the reason why streaming didn't work on Linux. 🥳

So please test this branch and suggest improvements.

@kevinhermawan: Wouldn't it make sense to have a single URLSession in OKHTTPClient for reuse (and for capturing the partial/streaming results)?
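
Purely as a sketch of that idea (SharedSessionClient is a made-up name, not a proposed OllamaKit type), the client could create its URLSession once and reuse it for every request:

import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking
#endif

// Illustrative only: one URLSession owned by the client and reused for all requests.
final class SharedSessionClient {
    private let session: URLSession

    init(configuration: URLSessionConfiguration = .default) {
        self.session = URLSession(configuration: configuration)
    }

    func send(_ request: URLRequest) async throws -> (Data, URLResponse) {
        // Bridged through a continuation so it also works where the async
        // URLSession conveniences are missing from swift-corelibs-foundation.
        try await withCheckedThrowingContinuation { continuation in
            session.dataTask(with: request) { data, response, error in
                if let error {
                    continuation.resume(throwing: error)
                } else if let data, let response {
                    continuation.resume(returning: (data, response))
                } else {
                    continuation.resume(throwing: URLError(.badServerResponse))
                }
            }.resume()
        }
    }
}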

@pd95 (Contributor, Author) commented Mar 14, 2025

For those interested in testing locally on a Mac:

mkdir OllamaSimple
cd OllamaSimple
swift package init --type executable

Add a dependency on this branch of OllamaKit to the application:

swift package add-dependency https://github.com/pd95/OllamaKit --branch "feature/make-Combine-optional"

Edit the "Package.swift" to add the OllamaKit dependency and a macOS 12 platform requirement.
Your "Package.swift" might look now like below:

// swift-tools-version: 6.0
// The swift-tools-version declares the minimum version of Swift required to build this package.

import PackageDescription

let package = Package(
    name: "OllamaSimple",
    platforms: [.macOS(.v12)],
    dependencies: [
        .package(url: "https://github.com/pd95/OllamaKit", branch: "feature/make-Combine-optional"),
    ],
    targets: [
        // Targets are the basic building blocks of a package, defining a module or a test suite.
        // Targets can depend on other targets in this package and products from dependencies.
        .executableTarget(
            name: "OllamaSimple", 
            dependencies: ["OllamaKit"]),
    ]
)

Edit the "main.swift" source and paste the code below.

@preconcurrency import Foundation    // @preconcurrency is needed to use fflush()
import OllamaKit

let baseURL: URL

// When running in a Docker container, we have to connect to the real host using a special DNS entry
if FileManager.default.fileExists(atPath: "/.dockerenv") {
    print("Running in docker, connecting to host.docker.internal")
    baseURL = URL(string: "http://host.docker.internal:11434")!
} else {
    print("Not running in docker, connecting to localhost")
    baseURL = URL(string: "http://localhost:11434")!
}

let ollamaKit = OllamaKit(baseURL: baseURL)
do {
    print("models:")
    fflush(stdout) // Ensure output is flushed

    let response = try await ollamaKit.models()
    for model in response.models {
        print("  ", model.name)
    }

    print("generating")
    fflush(stdout) // Ensure output is flushed

    let stream = ollamaKit.generate(data: OKGenerateRequestData(model: "phi4:latest", prompt: "Why is the sky blue? Explain it for a 5 year old child."))
    for try await data in stream {
        print(data.response, terminator: "")
        fflush(stdout) // Ensure output is flushed
    }
    print("\ndone")
    fflush(stdout) // Ensure output is flushed
} catch {
    print("Failed", error.localizedDescription)
}

This test case verifies the OllamaKit API integration. It makes two primary API calls:

  1. It fetches the available models from the Ollama API and prints their names.
  2. It sends a generation request with a specific prompt and model, then streams and prints the output in real time.

The test confirms that the Ollama API calls (both model retrieval and streaming text generation) work as expected.

You can now run this app on your Mac using the following command (make sure Ollama is running locally!):

swift run

If you want to test on Linux and have Docker installed, use the following command:

docker run --rm -v "$PWD:/code" -w /code swift:latest swift run

This will pull the latest Swift Docker image, mount the current directory ($PWD) as /code, and then run "swift run" inside the container.

@pd95 pd95 changed the title from "Make Combine optional to allow running on Linux" to "Running on Linux" on Mar 14, 2025