Remove `eval` from `@memoize` macro #48
Hello,
It does not seem to me that multiple dispatch is handled. Through the example of factorial, I will try to make this code run.

CASE A - ONE CACHE PER FUNCTION:

```julia
const _FACT_LUT = Dict{Type{<:Tuple},Dict}()

# memoizing only(methods(factorial, Tuple{Int64}))
_FACT_LUT[Tuple{Int64}] = Dict{Int64,Int64}()
let _inst_lut = _FACT_LUT[Tuple{Int64}]
    _inst_lut[1] = 1
    for n in 2:20
        _inst_lut[n] = _inst_lut[n-1] * n
    end
end

# memoizing only(methods(factorial, Tuple{Int128}))
_FACT_LUT[Tuple{Int128}] = Dict{Int128,Int128}()
let _inst_lut = _FACT_LUT[Tuple{Int128}]
    _inst_lut[1] = 1
    for n in 2:33  # 33 is the largest n with n! ≤ typemax(Int128)
        _inst_lut[n] = _inst_lut[n-1] * n
    end
end
```

CASE B - ONE CACHE FOR ALL FUNCTIONS:

```julia
const _LUT = Dict{Type{<:Tuple},Dict}()
# ...

# memoizing only(methods(factorial, Tuple{Int64}))
sig = Tuple{typeof(factorial),Union{Int64, UInt64}}
sigarg = Union{Int64, UInt64}
_LUT[sig] = Dict{sigarg,sigarg}()
let _inst_lut = _LUT[sig]
    # init (3rd arg ?)
    _inst_lut[1] = 1
    for n in 2:20
        _inst_lut[n] = _inst_lut[n-1] * n
    end
end

# memoizing only(methods(factorial, Tuple{Int128}))
sig = Tuple{typeof(factorial),Int128}
sigarg = Int128
_LUT[sig] = Dict{sigarg,sigarg}()
let _inst_lut = _LUT[sig]
    # init (3rd arg ?)
    _inst_lut[1] = 1
    for n in 2:33  # 33 is the largest n with n! ≤ typemax(Int128)
        _inst_lut[n] = _inst_lut[n-1] * n
    end
end
```

That is what the macro should generate. The main point is to have one subcache per method instance (i.e. per dispatch tuple / tuple type), and a supercache to collect the subcaches. Does that help you?
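The supercache/subcache idea above can be sketched without any macro at all. In this sketch, the names `_SUPERCACHE`, `subcache`, `fact`, and `memoized_fact` are illustrative assumptions, not Memoize.jl's actual API:

```julia
# Hypothetical supercache: one entry per call signature, each holding
# a subcache for that method instance.
const _SUPERCACHE = Dict{Type{<:Tuple},Dict}()

# Fetch (or lazily create) the subcache for the signature of `f`
# called with argument types `argtypes`, using key/value types K/V.
function subcache(f, argtypes::Type{<:Tuple}, K::Type, V::Type)
    sig = Tuple{typeof(f), argtypes.parameters...}
    get!(_SUPERCACHE, sig) do
        Dict{K,V}()
    end
end

fact(n::Int) = n == 1 ? 1 : fact(n - 1) * n

# Memoized wrapper: look up in the per-method subcache first.
function memoized_fact(n::Int)
    lut = subcache(fact, Tuple{Int}, Int, Int)
    get!(lut, n) do
        fact(n)
    end
end

memoized_fact(10)  # 3628800; a second call hits the cache
```

A real macro would generate this lookup code around the user's function body instead of requiring a hand-written wrapper.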
After

```julia
@memoize factorial(n::Int) =
    n == 1 ?
    1 :
    factorial(n-1) * n

factorial(10)
```

try to generate something like:

```julia
_CACHE[Tuple{typeof(factorial), Int}] =
    Dict{Tuple{Int64},Any}(
        (7,) => 5040,
        (4,) => 24,
        (8,) => 40320,
        (3,) => 6,
        (6,) => 720,
        (9,) => 362880,
        (2,) => 2,
        (5,) => 120,
        (10,) => 3628800,
        (1,) => 1)
```
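The desired cache shape above can be produced by hand before any macro exists; a minimal sketch, where `_CACHE` and `factorial_memo` are illustrative names rather than the package's actual output:

```julia
# Hypothetical supercache keyed by call signature.
const _CACHE = Dict{Type{<:Tuple},Dict}()

function factorial_memo(n::Int)
    # One subcache per method, keyed by the call signature.
    lut = get!(_CACHE, Tuple{typeof(factorial_memo), Int}) do
        Dict{Tuple{Int64},Any}()
    end
    # Cache on the argument tuple, as in the layout shown above.
    get!(lut, (n,)) do
        n == 1 ? 1 : factorial_memo(n - 1) * n
    end
end

factorial_memo(10)  # fills the subcache with (1,) => 1 up to (10,) => 3628800
```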
Fix #48, at the cost of putting the variable in a poorly-performing global. Not sure if this is acceptable. It's frustrating that Julia seemingly lacks the tools to deal with this elegantly.

- If `const` worked in a local function, we'd just put `const` and be done with it.
- If `typeconst` existed for global variables, that would work too.

Memoization.jl uses generated functions, which causes other problems. And it feels like the wrong solution too.
It also causes all instances of a closure to share the same cache, which is wrong:

```julia
julia> g = []
       for x in 1:10
           push!(g,
                 @memoize f(y) = x+y)
       end

julia> g[1](1)
2

julia> g[2](1)
2
```
Ah, this solution by @yuyichao might work out for us:

```julia
julia> begin
           local foo=Ref(1)
           g() = foo[]
       end
g (generic function with 1 method)

julia> @btime g()
  1.538 ns (0 allocations: 0 bytes)
1
```
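Applied to memoization, the same trick keeps the cache in a captured local, so neither `eval` nor a non-`const` global is needed. A sketch under that assumption; `memo_fib` and `cache` are illustrative names, not what Memoize.jl emits:

```julia
let cache = Dict{Int,Int}()
    # `global` makes the method visible outside the `let`; the captured
    # local `cache` plays the role of the Ref in the snippet above.
    global memo_fib(n::Int) = get!(cache, n) do
        n <= 2 ? 1 : memo_fib(n - 1) + memo_fib(n - 2)
    end
end

memo_fib(30)  # 832040, each subresult computed only once
```

Each expansion of a macro built this way would get its own `let`, which would also give every closure instance its own cache.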
Right now, the `@memoize` macro creates the cache dictionary using `eval`:

Memoize.jl/src/Memoize.jl, lines 49 to 52 in 1709785

This is generally considered a bad idea in macros, and until recently it also caused an error here because the code was `eval`ing into the Memoize package (see #32). Right now, using `eval` gives the following behavior:

Memoize.jl/test/runtests.jl, lines 257 to 266 in 1709785
I made one attempt to simply use a different cache dictionary for each method of a given function (JuliaCollections:1709785...JuliaCollections:a9170e5). This mostly works, but as-is causes the test above to fail, because the old cache is not released when the method is overwritten.
There may be an easy way around this, but it's not obvious to me right now.
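One possible direction (a sketch, not what the linked attempt does): have the macro unconditionally replace the signature's subcache at expansion time, so redefining a method drops its stale entries. The names `_CACHES` and `fresh_cache!` are hypothetical:

```julia
# Hypothetical supercache keyed by call signature.
const _CACHES = Dict{Type{<:Tuple},Dict}()

function fresh_cache!(sig::Type{<:Tuple})
    # Unconditionally replace any existing subcache for this signature,
    # letting the old one be garbage-collected. Returns the new subcache.
    _CACHES[sig] = Dict()
end

sig = Tuple{typeof(identity), Int}
c1 = fresh_cache!(sig)
c1[(1,)] = 1
c2 = fresh_cache!(sig)   # simulate redefining the method
isempty(c2)              # true: the stale entries were released
```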
Thoughts and/or pull requests welcome!
Cc: @cstjean