Flux v0.13.1
Closed issues:
- Batchnorm on GPU for Float64 values (#1897)
- Tag? (#1924)
- DataLoader causes scalar indexing on GPU in Flux v0.13.0 (regression) (#1935)
- Flux.flip with broadcasting warning (#1936)
- Add a workflow to clean-up `gh-pages` branch? (#1940)
- DimensionMismatch: All data containers must have the same number of observations. (#1941)
- Type instability in Recur for 3 dimensional arrays (#1947)
- What is the idiomatic way to get training loss from `gradient()`? (#1950) (see the sketch after this list)
- Dropout erroring on latest CUDA (#1960)
- AdaBelief issues (#1962)
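
For the training-loss question (#1950), one pattern that works with this release is to take a pullback (Zygote's `pullback`, available through Flux) so the primal loss value comes back alongside the gradient closure. The model, data, and loss below are made up for illustration; this is a minimal sketch, not the exact answer from the linked discussion:

```julia
using Flux

# Toy model and data, purely illustrative.
model = Dense(3 => 1)
x = rand(Float32, 3, 16)
y = rand(Float32, 1, 16)
ps = Flux.params(model)
opt = Flux.Optimise.ADAM()

# pullback returns the loss value plus a function that yields gradients,
# so the training loss is available without a second forward pass.
loss, back = Flux.pullback(() -> Flux.Losses.mse(model(x), y), ps)
grads = back(one(loss))
Flux.Optimise.update!(opt, ps, grads)
@info "training step" loss
```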
Merged pull requests:
- Add a ton of doctests + fix outdated documentation in `.md` files (#1916) (@Saransh-cpp)
- Get the DocBot up again! (#1937) (@Saransh-cpp)
- Broadcasting replaced with comprehension in the Flux.flip function. (#1938) (@fpartl)
- Fix type instabilities in apply!(optimizer, ...) (#1942) (@ancapdev)
- Add a workflow to delete PR previews (#1943) (@Saransh-cpp)
- Fix for progress logging to non-VS Code loggers (#1944) (@darsnack)
- Add Base.firstindex(c::Chain) = 1 (#1945) (@KronosTheLate)
- Recur type stability for 3d arrays (#1948) (@Marcovela)
- Resolve two warnings in the test suite (#1951) (@mcognetta)
- Update documentation on Split layer (#1953) (@JLDC)
- [docs] suggest using ADAM with LR=1 when combined with ExpDecay (#1955) (@ericphanson) (see the sketch after this list)
- Type stable `conv_reshape_bias` and AD-friendly `ConvDims` helpers (#1956) (@ToucheSir)
- onehotbatch with CuArray (#1959) (@CarloLucibello)
- AdaBelief bias correction (#1963) (@cossio)
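
The ExpDecay guidance from #1955 is easiest to see as a composed optimiser. The sketch below follows the wording of that PR title (ADAM run at a learning rate of 1, with ExpDecay supplying the decaying step size); the specific decay parameters and the ordering are illustrative, not taken from the PR:

```julia
using Flux
using Flux.Optimise: Optimiser, ExpDecay, ADAM

# ADAM's own learning rate is set to 1 so that the decaying rate from
# ExpDecay becomes the effective step size instead of being multiplied
# into ADAM's rate. ExpDecay arguments: η, decay factor, decay step, clip.
opt = Optimiser(ADAM(1.0), ExpDecay(0.01, 0.1, 1000, 1e-4))

# `opt` is then used like any other optimiser, e.g. with Flux.train!
# or Flux.Optimise.update!.
```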