Merge pull request #48 from PumasAI/densesubset
tests pass locally...
chriselrod authored Apr 14, 2022
2 parents 8072cc9 + bf268e9 commit 913cee0
Showing 15 changed files with 536 additions and 99 deletions.
8 changes: 6 additions & 2 deletions .github/workflows/CI.yml
@@ -12,7 +12,7 @@ defaults:
shell: bash
jobs:
coverage:
name: coverage=true/Julia ${{ matrix.version }}/${{ matrix.os }}/${{ matrix.arch }}/${{ github.event_name }}
name: coverage=true/Julia ${{ matrix.version }}/${{ matrix.threads }} threads/${{ matrix.os }}/${{ matrix.arch }}/${{ github.event_name }}
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
@@ -23,6 +23,9 @@ jobs:
- ubuntu-latest
version:
- 'nightly' # coverage fast on nightly
threads:
- '3'
- '4'
steps:
- uses: actions/checkout@v2
- uses: julia-actions/setup-julia@v1
@@ -44,7 +47,8 @@ jobs:
with:
coverage: true
env:
JULIA_NUM_THREADS: 2
JULIA_NUM_THREADS: ${{ matrix.threads }}
JULIA_CPU_THREADS: ${{ matrix.threads }}
- uses: julia-actions/julia-processcoverage@v1
- uses: codecov/codecov-action@v1
with:
4 changes: 3 additions & 1 deletion Project.toml
@@ -7,6 +7,7 @@ version = "0.2.1"
ArrayInterface = "4fba245c-0d91-5ea0-9b3e-6abc04ee57a9"
CPUSummary = "2a0fbf3d-bb9c-48f3-b0a9-814d99fd7ab9"
ChainRulesCore = "d360d2e6-b24c-11e9-a2a3-2a2ae2dbcce4"
CloseOpenIntervals = "fb6a15b2-703c-40df-9091-08a04967cfa9"
ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
HostCPUFeatures = "3e5b6fbb-0976-4d2c-9146-d79de83f2fb0"
IfElse = "615f187c-cbe4-4ef1-ba3b-2fcf58d6d173"
@@ -27,6 +28,7 @@ VectorizedRNG = "33b4df10-0173-11e9-2a0c-851a7edac40e"
ArrayInterface = "3, 5"
CPUSummary = "0.1.8"
ChainRulesCore = "0.8, 0.9, 0.10, 1"
CloseOpenIntervals = "0.1.6"
ForwardDiff = "0.10"
HostCPUFeatures = "0.1.7"
IfElse = "0.1"
@@ -37,7 +39,7 @@ Polyester = "0.4, 0.5, 0.6"
SIMDTypes = "0.1"
SLEEFPirates = "0.6"
Static = "0.3, 0.4, 0.6"
StrideArraysCore = "0.2.3, 0.3"
StrideArraysCore = "0.3.2"
UnPack = "1"
VectorizationBase = "0.21.28"
VectorizedRNG = "0.2.13"
2 changes: 1 addition & 1 deletion docs/src/examples/mnist.md
@@ -70,7 +70,7 @@ a number of rows equal to the length of the parameter vector `p`, and one column
per thread. For example:
```julia
estimated_num_cores = (Sys.CPU_THREADS ÷ ((Sys.ARCH === :x86_64) + 1));
G = similar(p, length(p), min(Threads.nthreads(), estimated_num_cores));
G = SimpleChains.alloc_threaded_grad(lenetloss);
```
Here, we're estimating that the number of physical cores is half the number of threads
on an `x86_64` system, which is true for most -- but not all!!! -- of them.
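The rule of thumb above can be factored into a small helper. Note that `est_cores` and the sample thread counts below are illustrative, not part of the SimpleChains API:

```julia
# Estimate physical core count from the logical CPU thread count: on x86_64,
# SMT/hyperthreading typically doubles the logical count, so we halve it;
# on other architectures we take the count as-is.
est_cores(cpu_threads, arch) = cpu_threads ÷ ((arch === :x86_64) + 1)

est_cores(8, :x86_64)   # 4 — assumes 2 logical threads per physical core
est_cores(8, :aarch64)  # 8 — most ARM chips lack SMT
```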
66 changes: 66 additions & 0 deletions docs/src/examples/smallmlp.md
@@ -1,3 +1,69 @@
# Small Multi-Layer Perceptron

Here, we'll fit a simple network:
```julia
using SimpleChains

mlpd = SimpleChain(
static(4),
TurboDense(tanh, 32),
TurboDense(tanh, 16),
TurboDense(identity, 4)
)
```

Our goal here will be to approximate the matrix exponential:
```julia
function f(x)
N = Base.isqrt(length(x))
A = reshape(view(x, 1:N*N), (N,N))
expA = exp(A)
vec(expA)
end

T = Float32;
X = randn(T, 2*2, 10_000);
Y = reduce(hcat, map(f, eachcol(X)));
Xtest = randn(T, 2*2, 10_000);
Ytest = reduce(hcat, map(f, eachcol(Xtest)));
```

Now, to train our network:
```julia
@time p = SimpleChains.init_params(mlpd);
G = SimpleChains.alloc_threaded_grad(mlpd);

mlpdloss = SimpleChains.add_loss(mlpd, SquaredLoss(Y));
mlpdtest = SimpleChains.add_loss(mlpd, SquaredLoss(Ytest));

report = let mtrain = mlpdloss, X=X, Xtest=Xtest, mtest = mlpdtest
p -> begin
let train = mlpdloss(X, p), test = mlpdtest(Xtest, p)
@info "Loss:" train test
end
end
end

report(p)
for _ in 1:3
@time SimpleChains.train_unbatched!(
G, p, mlpdloss, X, SimpleChains.ADAM(), 10_000
);
report(p)
end
```
I get
```julia
┌ Info: Loss:
│ train = 0.012996411f0
└ test = 0.021395735f0
0.488138 seconds
┌ Info: Loss:
│ train = 0.0027068993f0
└ test = 0.009439239f0
0.481226 seconds
┌ Info: Loss:
│ train = 0.0016358295f0
└ test = 0.0074498975f0
```
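As a quick sanity check on the target function `f` itself (independent of SimpleChains), the matrix exponential of the zero matrix is the identity, so a flattened 2×2 zero input should map to `vec(I)`. This check is illustrative only:

```julia
using LinearAlgebra  # provides exp(::AbstractMatrix)

# Same target function as above: interpret x as a square matrix,
# take its matrix exponential, and flatten the result column-major.
function f(x)
    N = Base.isqrt(length(x))
    A = reshape(view(x, 1:N*N), (N, N))
    vec(exp(A))
end

f(zeros(Float32, 4))  # ≈ [1, 0, 0, 1], since exp(0) = I
```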

15 changes: 6 additions & 9 deletions examples/mnist_lenet.jl
@@ -39,18 +39,14 @@ lenetloss = SimpleChains.add_loss(lenet, LogitCrossEntropyLoss(ytrain));
g = similar(p);
@time valgrad!(g, lenetloss, xtrain, p)

G = similar(
p,
length(p),
min(Threads.nthreads(), (Sys.CPU_THREADS ÷ ((Sys.ARCH === :x86_64) + 1))),
);
G = SimpleChains.alloc_threaded_grad(lenetloss);

@time SimpleChains.train_batched!(G, p, lenetloss, xtrain, SimpleChains.ADAM(3e-4), 10);

SimpleChains.accuracy_and_loss(lenetloss, xtrain, p),
SimpleChains.accuracy_and_loss(lenetloss, xtest, ytest, p)

SimpleChains.init_params!(lenet, p);
# SimpleChains.init_params!(lenet, p);
@time SimpleChains.train_batched!(G, p, lenetloss, xtrain, SimpleChains.ADAM(3e-4), 10);
SimpleChains.accuracy_and_loss(lenetloss, xtrain, p),
SimpleChains.accuracy_and_loss(lenetloss, xtest, ytest, p)
@@ -143,8 +139,9 @@ end
function loaders(xtrain, ytrain, xtest, ytest, args)
ytrain, ytest = onehotbatch(ytrain, 1:10), onehotbatch(ytest, 1:10)

train_loader = DataLoader((xtrain, ytrain), batchsize = args.batchsize, shuffle = true)
test_loader = DataLoader((xtest, ytest), batchsize = args.batchsize)
train_loader =
DataLoader((device(xtrain), device(ytrain)), batchsize = args.batchsize, shuffle = true)
test_loader = DataLoader((device(xtest), device(ytest)), batchsize = args.batchsize)

return train_loader, test_loader
end
@@ -166,7 +163,7 @@ function eval_loss_accuracy(loader, model, device)
acc += sum(onecold(ŷ |> cpu) .== onecold(y |> cpu))
ntot += size(x)[end]
end
return (loss = l / ntot |> round4, acc = acc / ntot * 100 |> round4)
return (acc = acc / ntot * 100 |> round4, loss = l / ntot |> round4)
end

## utility functions
10 changes: 6 additions & 4 deletions src/SimpleChains.jl
@@ -25,17 +25,19 @@ using ArrayInterface:
using SIMDTypes: Bit
using VectorizationBase: align, relu, stridedpointer, AbstractSIMD
using HostCPUFeatures: static_sizeof, register_size, register_count, static_sizeof
using CPUSummary: cache_linesize
using CPUSummary: cache_linesize, num_threads, num_cores
using LayoutPointers: bytestrideindex, stridedpointer, zero_offsets
using Static: One
using Static: One, lt
using CloseOpenIntervals: CloseOpen
using StrideArraysCore: zview
import ManualMemory: preserve_buffer
using IfElse: ifelse
import Random
import ChainRulesCore
import ForwardDiff

using LoopVectorization: matmul_params, CloseOpen, @turbo
using LoopVectorization: matmul_params, @turbo
# using LoopVectorization: matmul_params
# macro turbo(ex); esc(ex); end; macro turbo(ex0, ex1); esc(ex1); end
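The commented-out lines above sketch a debugging trick: shadowing LoopVectorization's `@turbo` with a no-op macro so the annotated loops run as plain Julia. A minimal self-contained sketch of the idea (the `colsum` example is hypothetical, not from the package):

```julia
# No-op stand-in for LoopVectorization's @turbo: return the loop expression
# unchanged, so the code executes as an ordinary Julia loop.
macro turbo(ex)
    esc(ex)
end
macro turbo(ex0, ex1)  # two-argument form: drop the options, keep the loop
    esc(ex1)
end

function colsum(A)
    s = zeros(eltype(A), size(A, 2))
    @turbo for j in axes(A, 2), i in axes(A, 1)
        s[j] += A[i, j]
    end
    return s
end

colsum(ones(3, 4))  # each column of a 3×4 matrix of ones sums to 3.0
```

Swapping the real macro for this stub is a cheap way to check whether a wrong result comes from the vectorized code generation or from the loop logic itself.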


@@ -59,8 +61,8 @@ export SimpleChain,
L2Penalty,
FrontLastPenalty

include("utils.jl")
include("simple_chain.jl")
include("utils.jl")
include("activation.jl")
include("dense.jl")
include("dropout.jl")
15 changes: 15 additions & 0 deletions src/conv.jl
@@ -856,6 +856,21 @@ function valgrad_layer!(pg::Ptr{T}, c::Conv, A, inds, p::Ptr{T}, pu::Ptr{UInt8})
Ptr{UInt8}(pu3),
)
end
function chain_valgrad_entry!(
pg,
arg,
layers::Tuple{Conv,X,Vararg},
inds,
p::Ptr,
pu::Ptr{UInt8},
) where {X}
l = getfield(layers, 1)
pg2, larg, p2, pu2 = valgrad_layer!(pg, l, arg, inds, p, pu)
val, grad, _ = chain_valgrad!(pg2, larg, Base.tail(layers), p2, pu2)
pullback_param!(pg, l, grad, arg, p, pu)
return val
end


function valgrad_layer!(
pg::Ptr{T},

9 comments on commit 913cee0

@chriselrod
Contributor Author

@JuliaRegistrator

Error while trying to register: Register Failed
@chriselrod, it looks like you are not a publicly listed member/owner in the parent organization (PumasAI).
If you are a member/owner, you will need to change your membership to public. See GitHub Help

@ChrisRackauckas
Member

@JuliaRegistrator register()

@JuliaRegistrator

Error while trying to register: Changing package repo URL not allowed, please submit a pull request with the URL change to the target registry and retry.

@ChrisRackauckas
Member

@ChrisRackauckas
Member

@JuliaRegistrator register()

@JuliaRegistrator

Registration pull request created: JuliaRegistries/General/58495

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the github interface, or via:

git tag -a v0.2.1 -m "<description of version>" 913cee0d9cc9566e1bbe5a3c6b7d9ddb120184eb
git push origin v0.2.1

Also, note the warning: Version 0.2.1 skips over 0.2.0
This can be safely ignored. However, if you want to fix this you can do so. Call register() again after making the fix. This will update the Pull request.

@ChrisRackauckas
Member

@JuliaRegistrator register()

@JuliaRegistrator

Registration pull request updated: JuliaRegistries/General/58495

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the github interface, or via:

git tag -a v0.2.1 -m "<description of version>" 913cee0d9cc9566e1bbe5a3c6b7d9ddb120184eb
git push origin v0.2.1
