[FEATURE] Add SYCL feature flags to rllm-llamacpp build (To add support for Intel GPUs) #96
base: main
Conversation
@microsoft-github-policy-service agree
rllm/llama-cpp-low/build.rs (Outdated)

```rust
    let dirs = [
        "/opt/intel/oneapi/compiler/latest/lib",
        "/opt/intel/oneapi/mkl/latest/lib",
        //"/opt/intel/oneapi/dnnl/latest/lib",
    ];

    // *.a  => static
    // *.so => dynamic
    for dir in dirs.iter() {
        println!("cargo:rustc-link-search={}", dir);
        for file in std::fs::read_dir(dir).unwrap() {
            let file = file.unwrap();
            let file_name = file.file_name();
            let file_name = file_name.to_str().unwrap();
            if !file_name.starts_with("lib") { continue; }
            if file_name.contains("lp64") && !file_name.contains("ilp64") { continue; }
            if file_name.contains("seq") { continue; }
            if file_name == "libmkl_gnu_thread.so" { continue; }
            let file_name = file_name.trim_start_matches("lib");

            if file_name.ends_with(".so") {
                let file_name = &file_name[..file_name.len() - 3];
                println!("cargo:rustc-link-lib=dylib={}", file_name);
            } else if file_name.ends_with(".a") {
                let file_name = &file_name[..file_name.len() - 2];
                println!("cargo:rustc-link-lib=static={}", file_name);
            }
        }
    }
    //panic!("stop here");

    //println!("cargo:rustc-link-search=native=/opt/intel/oneapi/compiler/latest/lib");
    //println!("cargo:rustc-link-lib=intlc");
    //println!("cargo:rustc-link-lib=svml");
    //println!("cargo:rustc-link-lib=sycl");
    //println!("cargo:rustc-link-search=native=/opt/intel/oneapi/mkl/latest/lib");
    //println!("cargo:rustc-link-lib=mkl_core");
    //println!("cargo:rustc-link-lib=mkl_sycl_blas");
    //println!("cargo:rustc-link-lib=mkl_sycl");
}
```
Currently struggling with dynamically linking the libraries while also resolving only the needed ones, without duplicates and with their dependencies resolved. I don't really know what a good approach is here...
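One possible way to avoid duplicate `cargo:rustc-link-lib` directives (a sketch only, not what this PR currently does) would be to first collect the unique library stems into a map, preferring the dynamic `.so` variant when both a `.so` and a `.a` exist, and only then emit one directive per library. The function name `collect_libs` and the directory list are hypothetical:

```rust
use std::collections::BTreeMap;

/// Scan `dirs` for lib*.so / lib*.a files and return a map from library
/// stem to link kind ("dylib" or "static"). Each stem appears only once,
/// and the dynamic variant wins when both file types are present, so no
/// duplicate link directives are emitted.
fn collect_libs(dirs: &[&str]) -> BTreeMap<String, &'static str> {
    let mut libs: BTreeMap<String, &'static str> = BTreeMap::new();
    for dir in dirs {
        let entries = match std::fs::read_dir(dir) {
            Ok(e) => e,
            Err(_) => continue, // skip oneAPI components that aren't installed
        };
        for entry in entries.flatten() {
            let name = entry.file_name();
            let name = match name.to_str() { Some(n) => n, None => continue };
            let rest = match name.strip_prefix("lib") { Some(r) => r, None => continue };
            if let Some(stem) = rest.strip_suffix(".so") {
                // .so always overrides a previously seen .a
                libs.insert(stem.to_string(), "dylib");
            } else if let Some(stem) = rest.strip_suffix(".a") {
                // only record the static lib if no .so was seen
                libs.entry(stem.to_string()).or_insert("static");
            }
        }
    }
    libs
}

fn main() {
    // Hypothetical directory list; in build.rs these would be the oneAPI paths.
    let dirs = ["/opt/intel/oneapi/mkl/latest/lib"];
    for (stem, kind) in collect_libs(&dirs) {
        println!("cargo:rustc-link-lib={}={}", kind, stem);
    }
}
```

The filtering rules from the diff above (skip `lp64`, `seq`, `libmkl_gnu_thread.so`) could be applied inside the inner loop in the same way; this only sketches the deduplication part.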
The basic build works now. 🚀 Known Issues:
The goal of this pull request is to add Intel GPU support to the build script of rllm's llama.cpp.
This is done by
The code will be tested on a machine with a 13th Gen Intel processor and an Intel Arc A770 GPU.