Description
Hi,
Thanks for the great work on Finch.jl!
I’m trying to use Finch.jl as the execution backend for my project. My input tensors are model parameters trained in Python:
- Sparse tensors: stored in `.pkl` using Python's `sparse.COO` format
- Dense tensors: stored in `.npz` as NumPy arrays
My current idea is to store all raw arrays in `.npz` and keep the tensor-type metadata in a `.json` manifest, so that after reading them into Julia I can convert each array into the desired Finch tensor format.
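To make the Julia side simple, the Python side can write the COO components (coordinates, values, shape) as plain NumPy arrays in one `.npz` file. Here is a minimal sketch of that export step; `export_coo_npz` and the array names `coords`/`data`/`shape` are my own conventions, not an existing API. For a pydata `sparse.COO` object `s`, the `s.coords` (shape `(ndim, nnz)`) and `s.data` attributes provide these arrays directly.

```python
import numpy as np

def export_coo_npz(path, coords, data, shape):
    """Write a COO-style sparse tensor as plain arrays in one .npz file.

    coords: (ndim, nnz) integer array of nonzero positions
    data:   (nnz,) array of the corresponding values
    shape:  the dense shape of the tensor
    """
    np.savez(path,
             coords=np.asarray(coords, dtype=np.int64),
             data=np.asarray(data),
             shape=np.asarray(shape, dtype=np.int64))

# Example: a small 3-d tensor with two nonzeros.
dense = np.zeros((2, 3, 4))
dense[0, 1, 2] = 5.0
dense[1, 0, 3] = -1.0

coords = np.array(np.nonzero(dense))  # (3, nnz) coordinate array
data = dense[tuple(coords)]           # (nnz,) values at those coordinates
export_coo_npz("params.npz", coords, data, dense.shape)

# Round-trip check: rebuild the dense array from the saved components.
with np.load("params.npz") as f:
    rebuilt = np.zeros(tuple(f["shape"]))
    rebuilt[tuple(f["coords"])] = f["data"]
assert np.array_equal(rebuilt, dense)
```

Storing the three arrays separately avoids pickling, so nothing on the Julia side has to understand Python object serialization.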
However, the existing Finch tensor constructors seem to build tensors from scratch, and I couldn't find an API that directly wraps or converts external tensor data (i.e., data that is not already a Finch tensor) into Finch-compatible formats, especially for large sparse inputs.
Question:
Is there an existing or recommended way to read large sparse tensors from formats like `.npz`/`.json` (or even Python's `sparse.COO`) and convert them efficiently into Finch tensor formats?
Any tips, helper functions, or example code for this workflow would be greatly appreciated.
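For reference, this is roughly the Julia-side workflow I have in mind, reading the raw arrays with NPZ.jl and handing them to Finch. I'm not sure of the exact Finch API here, so please treat this as a sketch: I'm assuming `fsparse` accepts per-dimension index vectors plus values and a shape, and that a `Tensor` constructor can copy an existing tensor into a chosen storage format (the file name `params.npz` and the array names `coords`/`data`/`shape` follow my export convention above).

```julia
using NPZ, Finch

# Read the arrays written from Python (names follow my own .npz convention).
f = npzread("params.npz")
coords = f["coords"] .+ 1          # Python indices are 0-based; Julia's are 1-based
data   = f["data"]
shape  = Tuple(Int.(f["shape"]))

# One index vector per dimension, as COO constructors typically expect.
I = ntuple(d -> coords[d, :], length(shape))

# Assumed: build a COO-style Finch tensor from coordinate lists
# (please check the Finch docs for the exact `fsparse` signature) …
A = fsparse(I..., data, shape)

# … then copy it into the storage format you actually want to compute with.
B = Tensor(Dense(SparseList(Element(0.0))), A)
```

If something like this is the intended route, a pointer to the canonical conversion API (or a helper that does the COO-to-format copy in one step) would be very helpful.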
Thanks!