Wma/shard levels #697
base: main
Conversation
Remaining comments that could not be posted as review comments, to avoid the GitHub rate limit:
JuliaFormatter
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Line 142 in 2289266
VirtualShardLevel(device_2, lvl_2, sym, ptr, task, val, typeof(level_fill_value(Lvl)), Device, Lvl, Ptr, Task, Val)
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Line 147 in 2289266
virtual_level_resize!(ctx, lvl::VirtualShardLevel, dims...) = (lvl.lvl = virtual_level_resize!(ctx, lvl.lvl, dims...); lvl)
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Line 154 in 2289266
push_preamble!(ctx, quote
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Lines 157 to 158 in 2289266
end)
push_epilogue!(ctx, quote
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Line 160 in 2289266
end)
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Line 165 in 2289266
push_preamble!(ctx,
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Line 167 in 2289266
lvl_2 = virtualize(ctx_2, :($(lvl.ex).val[$(ctx_2(get_task_num(get_task(ctx_2))))]), lvl.Lvl) #TODO should this virtualize the eltype of Val?
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Line 169 in 2289266
end
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Lines 191 to 195 in 2289266
push_preamble!(ctx, quote
    Finch.resize_if_smaller!($(lvl.task), $(ctx(pos_stop)))
    Finch.resize_if_smaller!($(lvl.ptr), $(ctx(pos_stop)))
    Finch.fill_range!($(lvl.task), $(ctx(pos_start)), $(ctx(pos_stop)), 0)
end)
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Lines 220 to 221 in 2289266
return Thunk(
    body = (ctx) -> begin
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Lines 231 to 232 in 2289266
return Thunk(
    body = (ctx) -> begin
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Line 241 in 2289266
end
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Line 244 in 2289266
end
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Line 271 in 2289266
return Thunk(
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Line 279 in 2289266
$(contain(ctx_2->assemble_level!(ctx_2, lvl.lvl, value(qos_fill, Tp), value(qos_stop, Tp)), ctx))
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Line 288 in 2289266
body = (ctx) -> VirtualHollowSubFiber(lvl.lvl, value(qos), dirty),
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Line 295 in 2289266
end
[JuliaFormatter] reported by reviewdog 🐶
Finch.jl/src/tensors/levels/shard_levels.jl
Line 297 in 2289266
end
Force-pushed from e0133b8 to 311c781.
Force-pushed from fc3e6be to 5ce43dd.
lvl = transfer(MultiChannelMemory(device, get_num_tasks(device)), lvl)
ShardLevel{Device}(
    device,
    transfer(MultiChannelMemory(device, get_num_tasks(device)), lvl),
Why do we call transfer on lvl twice?
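A minimal sketch of one possible answer, assuming the first call already moves the level onto the multi-channel memory and the second call is simply redundant (the remaining constructor arguments, elided in the diff, stay as they are):

    # sketch only: transfer once and reuse the result
    lvl_2 = transfer(MultiChannelMemory(device, get_num_tasks(device)), lvl)
    ShardLevel{Device}(
        device,
        lvl_2,
        # ... remaining arguments unchanged from the original call
    )

If both calls are intentional (for example, one copy per channel), a comment at the call site explaining that would also resolve the question.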
task_2 = transfer(device, lvl.task)
qos_fill_2 = transfer(device, lvl.used)
qos_stop_2 = transfer(device, lvl.alloc)
return ShardLevel(lvl_2, ptr_2, task_2, qos_fill_2, qos_stop_2)
Do we need the device argument?
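One hedged reading of this question, assuming the constructor used earlier in the diff, ShardLevel{Device}(device, lvl, ...), also expects the device first here (an assumption, not confirmed by this snippet):

    # sketch only: if the level records its device, pass it through when rebuilding
    return ShardLevel(device, lvl_2, ptr_2, task_2, qos_fill_2, qos_stop_2)

If the device is instead fully captured by the transferred buffers, the argument could likely be dropped from this method's signature altogether.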
Add ShardLevels and rework parallelism in Finch