Conversation

@willow-ahrens
Member

Add ShardLevels and rework parallelism in Finch

Contributor

@github-actions github-actions bot left a comment


Remaining comments that could not be posted as review comments (to avoid the GitHub rate limit):

JuliaFormatter

[JuliaFormatter] reported by reviewdog 🐶

```julia
VirtualShardLevel(device_2, lvl_2, sym, ptr, task, val, typeof(level_fill_value(Lvl)), Device, Lvl, Ptr, Task, Val)
```


[JuliaFormatter] reported by reviewdog 🐶

```julia
virtual_level_resize!(ctx, lvl::VirtualShardLevel, dims...) = (lvl.lvl = virtual_level_resize!(ctx, lvl.lvl, dims...); lvl)
```


[JuliaFormatter] reported by reviewdog 🐶

```julia
push_preamble!(ctx, quote
```


[JuliaFormatter] reported by reviewdog 🐶

```julia
end)
push_epilogue!(ctx, quote
```


[JuliaFormatter] reported by reviewdog 🐶

```julia
push_preamble!(ctx,
```


[JuliaFormatter] reported by reviewdog 🐶

```julia
lvl_2 = virtualize(ctx_2, :($(lvl.ex).val[$(ctx_2(get_task_num(get_task(ctx_2))))]), lvl.Lvl) # TODO should this virtualize the eltype of Val?
```


[JuliaFormatter] reported by reviewdog 🐶

```julia
push_preamble!(ctx, quote
    Finch.resize_if_smaller!($(lvl.task), $(ctx(pos_stop)))
    Finch.resize_if_smaller!($(lvl.ptr), $(ctx(pos_stop)))
    Finch.fill_range!($(lvl.task), $(ctx(pos_start)), $(ctx(pos_stop)), 0)
end)
```


[JuliaFormatter] reported by reviewdog 🐶

```julia
return Thunk(
    body = (ctx) -> begin
```


[JuliaFormatter] reported by reviewdog 🐶

```julia
return Thunk(
    body = (ctx) -> begin
```


[JuliaFormatter] reported by reviewdog 🐶

```julia
$(contain(ctx_2 -> assemble_level!(ctx_2, lvl.lvl, value(qos_fill, Tp), value(qos_stop, Tp)), ctx))
```


[JuliaFormatter] reported by reviewdog 🐶

```julia
body = (ctx) -> VirtualHollowSubFiber(lvl.lvl, value(qos), dirty),
```


@codecov

codecov bot commented Mar 18, 2025

Codecov Report

Attention: Patch coverage is 25.18892% with 297 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/tensors/levels/shard_levels.jl | 0.00% | 259 Missing ⚠️ |
| src/architecture.jl | 0.00% | 16 Missing ⚠️ |
| src/tensors/levels/sparse_rle_levels.jl | 52.00% | 12 Missing ⚠️ |
| ext/SparseArraysExt.jl | 57.89% | 8 Missing ⚠️ |
| src/tensors/levels/dense_levels.jl | 50.00% | 2 Missing ⚠️ |
| Files with missing lines | Coverage Δ |
|---|---|
| src/Finch.jl | 94.68% <ø> (ø) |
| src/tensors/levels/dense_rle_levels.jl | 79.55% <100.00%> (-4.00%) ⬇️ |
| src/tensors/levels/separate_levels.jl | 69.93% <ø> (-4.20%) ⬇️ |
| src/tensors/levels/sparse_band_levels.jl | 75.40% <100.00%> (ø) |
| src/tensors/levels/sparse_bytemap_levels.jl | 79.44% <100.00%> (-4.75%) ⬇️ |
| src/tensors/levels/sparse_coo_levels.jl | 83.40% <100.00%> (-4.81%) ⬇️ |
| src/tensors/levels/sparse_dict_levels.jl | 79.48% <100.00%> (-3.42%) ⬇️ |
| src/tensors/levels/sparse_interval_levels.jl | 70.23% <100.00%> (-7.74%) ⬇️ |
| src/tensors/levels/sparse_list_levels.jl | 89.09% <100.00%> (-3.80%) ⬇️ |
| src/tensors/levels/sparse_vbl_levels.jl | 77.02% <100.00%> (ø) |
| ... and 5 more | |

... and 27 files with indirect coverage changes


@willow-ahrens willow-ahrens requested a review from Paramuths April 22, 2025 21:01
```julia
lvl = transfer(MultiChannelMemory(device, get_num_tasks(device)), lvl)
ShardLevel{Device}(
    device,
    transfer(MultiChannelMemory(device, get_num_tasks(device)), lvl),
```
Collaborator


Why do we call `transfer` on `lvl` twice?
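The snippet above binds the transferred level back to `lvl`, then calls `transfer` again on that already-transferred level inside the constructor. A minimal sketch of what reusing the first result might look like (hypothetical, and truncated like the original; assumes the second call was not intentional):

```julia
# Transfer once, then pass the result to the constructor,
# rather than transferring the already-transferred level again.
lvl_2 = transfer(MultiChannelMemory(device, get_num_tasks(device)), lvl)
ShardLevel{Device}(
    device,
    lvl_2,
    # ... remaining constructor arguments as in the original
```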

```julia
task_2 = transfer(device, lvl.task)
qos_fill_2 = transfer(device, lvl.used)
qos_stop_2 = transfer(device, lvl.alloc)
return ShardLevel(lvl_2, ptr_2, task_2, qos_fill_2, qos_stop_2)
```
Collaborator


Do we need the `device` argument here?
