Running on Linux #48
Conversation
The branch was force-pushed from 003b92d to c97a887.
The streaming API still does not work on Linux according to my tests.
The branch was force-pushed from 4a37088 to 05c7d8c.
@danpalmer: Today I learned about using the Swift Docker image to build for Linux on my Mac, and thanks to an AI assistant I was able to identify the reason why streaming didn't work on Linux. 🥳 So please test this branch and suggest improvements.

@kevinhermawan: Wouldn't it make sense to have a single …
For those interested in testing locally on a Mac, add a dependency on this branch of OllamaKit to your application:

Edit the `Package.swift` to add the OllamaKit dependency and a macOS 12 platform requirement:

```swift
// swift-tools-version: 6.0
// The swift-tools-version declares the minimum version of Swift required to build this package.
import PackageDescription
let package = Package(
name: "OllamaSimple",
platforms: [.macOS(.v12)],
dependencies: [
.package(url: "https://github.com/pd95/OllamaKit", branch: "feature/make-Combine-optional"),
],
targets: [
// Targets are the basic building blocks of a package, defining a module or a test suite.
// Targets can depend on other targets in this package and products from dependencies.
.executableTarget(
name: "OllamaSimple",
dependencies: ["OllamaKit"]),
]
)
```

Edit the `main.swift` source and paste the code below:

```swift
@preconcurrency import Foundation // @preconcurrency is needed to use fflush()
import OllamaKit
let baseURL: URL
// When running in a Docker container, we have to connect to the real host using a special DNS entry
if FileManager.default.fileExists(atPath: "/.dockerenv") {
print("Running in docker, connecting to host.docker.internal")
baseURL = URL(string: "http://host.docker.internal:11434")!
} else {
print("Not running in docker, connecting to localhost")
baseURL = URL(string: "http://localhost:11434")!
}
let ollamaKit = OllamaKit(baseURL: baseURL)
do {
print("models:")
fflush(stdout) // Ensure output is flushed
let response = try await ollamaKit.models()
for model in response.models {
print(" ", model.name)
}
print("generating")
fflush(stdout) // Ensure output is flushed
let stream = ollamaKit.generate(data: OKGenerateRequestData(model: "phi4:latest", prompt: "Why is the sky blue? Explain it for a 5 year old child."))
for try await data in stream {
print(data.response, terminator: "")
fflush(stdout) // Ensure output is flushed
}
print("\ndone")
fflush(stdout) // Ensure output is flushed
} catch {
print("Failed", error.localizedDescription)
}
```

This test case verifies the OllamaKit API integration. It makes two primary API calls:

- `ollamaKit.models()` to retrieve the list of locally available models
- `ollamaKit.generate(data:)` to stream a generated response for a prompt

The test confirms that the Ollama API calls (both model retrieval and streaming text generation) work as expected. You can now run this app on your Mac with `swift run` (make sure Ollama is running locally!).
If you want to test on Linux and have Docker installed, use a command along these lines:

`docker run --rm -v "$PWD:/code" -w /code swift:latest swift run`

This will pull the latest Swift Docker image, mount the current directory (`$PWD`) as `/code`, and then run `swift run` inside it.
This PR tries to make OllamaKit available on non-Apple platforms by removing the dependency on Combine where not available.

Further, as the `URLSession.bytes()` method has not (yet?) been implemented in https://github.com/swiftlang/swift-corelibs-foundation, I have reimplemented the `OKHTTPClient.stream()` async method.
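To make the two ideas above concrete, here is a minimal sketch, not the actual code from this branch: Combine-only code is guarded with `#if canImport(Combine)`, and on platforms without `URLSession.bytes(for:)` the response body is streamed through a `URLSessionDataDelegate` that feeds an `AsyncThrowingStream`. The names `StreamingDelegate` and `streamChunks(for:)` are illustrative and not part of the OllamaKit API.

```swift
#if canImport(Combine)
import Combine  // Combine-based APIs are only compiled where Combine exists
#endif
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking  // URLSession lives here on Linux
#endif

/// Forwards each chunk received by a data task into an AsyncThrowingStream.
final class StreamingDelegate: NSObject, URLSessionDataDelegate {
    private let continuation: AsyncThrowingStream<Data, Error>.Continuation

    init(continuation: AsyncThrowingStream<Data, Error>.Continuation) {
        self.continuation = continuation
    }

    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data) {
        continuation.yield(data)  // emit each chunk as soon as it arrives
    }

    func urlSession(_ session: URLSession, task: URLSessionTask, didCompleteWithError error: Error?) {
        continuation.finish(throwing: error)  // end the stream, passing along any error
    }
}

/// Streams the raw response body of `request` chunk by chunk,
/// without relying on `URLSession.bytes(for:)`.
func streamChunks(for request: URLRequest) -> AsyncThrowingStream<Data, Error> {
    AsyncThrowingStream { continuation in
        let delegate = StreamingDelegate(continuation: continuation)
        let session = URLSession(configuration: .default, delegate: delegate, delegateQueue: nil)
        session.dataTask(with: request).resume()
        // A real implementation would also handle cancellation and invalidate the session.
    }
}
```

A caller can then iterate with `for try await chunk in streamChunks(for: request)` and decode each chunk (for Ollama, typically line-delimited JSON), much like the `for try await data in stream` loop in the test program above.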