- Initial implementation of the robust hat matrix regression estimator
- Add more tests to the robust hat matrix regression estimator
- Introduce `view` in LTS and LMS.
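As a standalone sketch of the difference (not the package's code): slicing with indices copies the data, while `view` reads through to the parent array, which avoids one allocation per subset in resampling-heavy estimators like LTS and LMS.

```julia
X = rand(1000, 5)
indices = 1:500

Xcopy = X[indices, :]        # allocates a new 500x5 matrix
Xview = view(X, indices, :)  # no copy; indexes into X directly

# Both behave like matrices, so downstream code is unchanged:
sum(Xcopy) ≈ sum(Xview)      # true
```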
- More explicit return types: replace plain `Dict` with `Dict{String, Any}` or `Dict{String, Vector}`
- Add Julia v1.10 to GitHub Actions
- Initial attempt to reduce memory allocations in `lts()`, `lms()`, `hadi92()`, `hadi94()`, `hs93()`, and `robcov()`
- Replace the `@assert` macro with `throw(ErrorException())` throughout the codebase (a sketch of the pattern appears below).
- `deepestregression` returns a `Dict` instead of a vector of betas like the other regression methods.
- `summary()` throws an `ErrorException` rather than simply prompting with the `@error` macro.
- `robcov` no longer uses try/catch.
- Replace `sortperm` with `sortperm!` in `mve`.
- Set the number of iterations of LTS to `minimum([5 * p, 3000])` as in the R implementation. This reduces the time required 3x.
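Regarding the `@assert` replacement above, here is a minimal sketch of the pattern; `checkdims` is a hypothetical helper, not part of the package. The motivation is that an `@assert` may be disabled at some optimization levels and gives less control over the error message.

```julia
# Hypothetical helper illustrating the @assert-to-throw pattern.
function checkdims(X::AbstractMatrix, y::AbstractVector)
    # Old style (may be compiled out): @assert size(X, 1) == length(y)
    if size(X, 1) != length(y)
        throw(ErrorException("number of rows of X must match the length of y"))
    end
    return nothing
end
```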
- Minor fixes
- Optional `eps` and `maxiter` parameters in `iterateCSteps()` in LTS.
- Replace `ols(X[indices, :], y[indices])` with `X[...] \ y[...]` in computation-heavy estimators such as LTS, LMS, and LTA.
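The rationale, sketched with illustrative data: Julia's `\` operator solves the least-squares problem directly (via QR for rectangular matrices), skipping the overhead of building a full regression result object inside hot loops.

```julia
X = hcat(ones(50), rand(50, 2))            # design matrix with intercept
y = X * [1.0, 2.0, 3.0] + 0.1 * randn(50)
indices = 1:25                             # a subset, as drawn in LTS/LMS/LTA

betas = X[indices, :] \ y[indices]         # least-squares fit on the subset
```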
- Concrete types of `X` and `y` changed to `AbstractMatrix{Float64}` and `AbstractVector{Float64}`
- Change function signatures from `::Function` to `::F where {F <: Function}`
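The reason, in sketch form: Julia may skip specializing on a bare `::Function` argument when the function is only passed through to another call; a type parameter forces a specialization per concrete function type.

```julia
# May not specialize on f, since f is only passed along to map:
apply_all(f::Function, xs) = map(f, xs)

# Compiles a separate specialization for each function type F:
apply_all_fast(f::F, xs) where {F <: Function} = map(f, xs)

apply_all_fast(abs, [-1.0, 2.0, -3.0])   # [1.0, 2.0, 3.0]
```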
- Increase test coverage
- Deepest Regression Estimator added.
- `mahalanobisSquaredBetweenPairs()` returns `Union{Nothing, Matrix}` depending on the determinant of the covariance matrix
- `mahalanobisSquaredMatrix()` returns `Union{Nothing, Matrix}` depending on the determinant of the covariance matrix
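A hedged sketch of this return convention (illustrative code, not the actual implementation): when the covariance matrix is numerically singular, `nothing` is returned instead of a matrix, and callers check with `isnothing`.

```julia
using LinearAlgebra, Statistics

function mahalanobis_squared_sketch(X::Matrix{Float64})::Union{Nothing, Matrix}
    covmat = cov(X)
    abs(det(covmat)) < sqrt(eps()) && return nothing   # singular covariance
    centered = X .- mean(X, dims = 1)
    return centered * inv(covmat) * centered'
end

result = mahalanobis_squared_sketch(rand(10, 3))
isnothing(result) ? "singular" : size(result)          # (10, 10) here
```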
- `import` in DataImages fixed.
- `Array{Float64, 1}` is replaced by `Vector{Float64}`.
- `Array{Float64, 2}` is replaced by `Matrix{Float64}`.
- Use of try/catch reduced; many try blocks existed only to handle singularities.
- Update compatibility with Clustering and StatsModels
- LMS returns `betas` rather than `coefs`
- PrecompileTools integration for faster loading of the package
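For context, a minimal sketch of what a PrecompileTools workload looks like; the package's actual workload differs.

```julia
using PrecompileTools

@setup_workload begin
    X = hcat(ones(20), rand(20, 2))   # setup code itself is not precompiled
    y = rand(20)
    @compile_workload begin
        X \ y                         # calls here are compiled when the package is built
    end
end
```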
- Replace `RGB{}` with `RGBX{}` in plots
- Adopt SnoopPrecompile for a better first-time-use experience
# v0.9.4
- Convergence check for `hs93()`
- Add `earlystop` option for the inexact version of `lta()`.
- Remove `cga()` and the cga-based experimental algorithm.
- Fix deadlock in `bacon()`.
- `ga()` is now faster.
- Implement `dfbetas()`.
- Separate implementations of `dffit()` and `dffits()`
- Implement `diagnose()`
- Replace GLPK with HiGHS. When n > 10000, the HiGHS-based `lad()` is now approximately 6x faster.
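For reference, a minimal sketch of the textbook LP formulation of LAD with JuMP and HiGHS; this is the standard model and not necessarily the package's exact formulation. Each `u[i]` bounds the absolute residual of observation `i`.

```julia
using JuMP, HiGHS

function lad_sketch(X::Matrix{Float64}, y::Vector{Float64})
    n, p = size(X)
    model = Model(HiGHS.Optimizer)
    set_silent(model)
    @variable(model, beta[1:p])
    @variable(model, u[1:n] >= 0)                                 # u[i] ≥ |residual i|
    @constraint(model, [i = 1:n],  y[i] - X[i, :]' * beta <= u[i])
    @constraint(model, [i = 1:n], -(y[i] - X[i, :]' * beta) <= u[i])
    @objective(model, Min, sum(u))                                # minimize Σ|residuals|
    optimize!(model)
    return value.(beta)
end
```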
- `asm2000()`, `imon2005()`, `ks89()`, `bacon()`, and `smr98()` now return regression coefficients calculated using the clean set of observations.
- `lts()` has a new optional parameter, `earlystop`, which is `true` by default. If the objective function does not improve within a predefined number of iterations, the search stops.
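The earlystop idea in sketch form (standalone illustration, not `lts()` internals):

```julia
# Stop when the objective has not improved for `patience` consecutive steps.
function best_with_earlystop(objectives::Vector{Float64}; patience::Int = 100)
    best, stale = Inf, 0
    for obj in objectives
        if obj < best
            best, stale = obj, 0
        else
            stale += 1
            stale >= patience && break    # early stop
        end
    end
    return best
end
```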
- `py95()` returns a vector of estimated regression coefficients using the clean observations.
- Increase test coverage
- Add `ols(setting)` and `wls(setting, weights = w)` style method calls, where `setting` is a regression setting
- Implement `cooksoutliers()` method for determining potential regression outliers using a cutoff value.
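A hedged sketch of the underlying idea, using the standard Cook's distance with the common 4/n cutoff; the package's `cooksoutliers()` may compute the distances or choose the cutoff differently.

```julia
using LinearAlgebra

function cooks_distances_sketch(X::Matrix{Float64}, y::Vector{Float64})
    n, p = size(X)
    H = X * inv(X' * X) * X'                 # hat matrix
    r = y - H * y                            # OLS residuals
    s2 = sum(abs2, r) / (n - p)              # residual variance estimate
    h = diag(H)
    return (r .^ 2 ./ (p * s2)) .* (h ./ (1 .- h) .^ 2)
end

X = hcat(ones(30), rand(30, 2)); y = rand(30)
d = cooks_distances_sketch(X, y)
outliers = findall(d .> 4 / length(y))       # indices exceeding the cutoff
```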
- Update documentation
- Implement Theil-Sen estimator for multiple regression
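For intuition, the classical univariate Theil-Sen estimator is sketched below; the package implements a generalization to multiple regression, which this sketch does not cover.

```julia
using Statistics

# Slope = median of all pairwise slopes; intercept = median offset.
function theilsen_simple(x::Vector{Float64}, y::Vector{Float64})
    n = length(x)
    slopes = [(y[j] - y[i]) / (x[j] - x[i]) for i in 1:(n - 1) for j in (i + 1):n]
    b1 = median(slopes)
    b0 = median(y .- b1 .* x)
    return (b0, b1)
end
```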
- Fix `bchplot` dependencies
- Update README with new instructions
- Satman2015 now returns more verbose output
- Add `exact` argument for LAD. If `exact` is `true`, the linear programming based exact solution is found. Otherwise, a GA based search is performed to yield approximate solutions.
- Remove the dependency on Plots.jl. If Plots.jl is installed and loaded manually, the functionality that uses it is automatically loaded by Requires.jl. Affected functions are `dataimage`, `mveltsplot`, and `bchplot`.
- Update Satman(2013) algorithm
- Add docs for Satman's (modified) GA based LTS estimation (2012)
- Remove the dependency on StatsBase
- Quantile Regression implemented
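One common formulation, sketched for context (the package's implementation may differ): quantile regression at level τ minimizes the asymmetric loss τ·Σu⁺ + (1-τ)·Σu⁻ with the residual split as y - Xβ = u⁺ - u⁻, which is again a linear program.

```julia
using JuMP, HiGHS

function quantreg_sketch(X::Matrix{Float64}, y::Vector{Float64}, tau::Float64)
    n, p = size(X)
    model = Model(HiGHS.Optimizer)
    set_silent(model)
    @variable(model, beta[1:p])
    @variable(model, uplus[1:n] >= 0)     # positive part of the residual
    @variable(model, uminus[1:n] >= 0)    # negative part of the residual
    @constraint(model, [i = 1:n], y[i] - X[i, :]' * beta == uplus[i] - uminus[i])
    @objective(model, Min, tau * sum(uplus) + (1 - tau) * sum(uminus))
    optimize!(model)
    return value.(beta)
end
```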
- Modularize `dataimage()`
- Grouped tests
- `x === nothing` style checks replaced by `isnothing(x)`
- Update documentation
- Modularization
- Removed some unused variables
- Refactor code
- Update docs system
- Dependency entries updated in Project.toml
- `asyncmap` replaced with `map` in `lta`
- JuMP version increased
- Julia compatibility level is now 1.7
- Update JuMP and GLPK
- LAD (Least Absolute Deviations) is now exact and uses a linear programming based model
- Dependencies for JuMP and GLPK are added
- Dependency for Optim removed