Reference
Contents

- FluxNLPModels.FluxNLPModel (Type)
- FluxNLPModels.FluxNLPModel (Method)
- FluxNLPModels.accuracy
- FluxNLPModels.minibatch_next_test!
- FluxNLPModels.minibatch_next_train!
- FluxNLPModels.reset_minibatch_test!
- FluxNLPModels.reset_minibatch_train!
- FluxNLPModels.set_vars!
- NLPModels.grad!
- NLPModels.obj
- NLPModels.objgrad!
FluxNLPModels.FluxNLPModel — Type

```julia
FluxNLPModel{T, S, C <: Flux.Chain} <: AbstractNLPModel{T, S}
```

Data structure that provides the interface between neural networks defined with Flux.jl and NLPModels. A `FluxNLPModel` has the following fields.
Arguments
- `meta` and `counters` retain information about the `FluxNLPModel`;
- `chain` is the chained structure representing the neural network;
- `data_train` is the complete training data set;
- `data_test` is the complete test data set;
- `size_minibatch` parametrizes the size of the training and test minibatches;
- `training_minibatch_iterator` is an iterator over the training minibatches;
- `test_minibatch_iterator` is an iterator over the test minibatches;
- `current_training_minibatch` is the training minibatch used to evaluate the neural network;
- `current_test_minibatch` is the current test minibatch; it is not used in practice;
- `w` is the vector of weights/variables.
FluxNLPModels.FluxNLPModel — Method

```julia
FluxNLPModel(chain_ANN, data_train=MLDatasets.MNIST.traindata(Float32), data_test=MLDatasets.MNIST.testdata(Float32); size_minibatch=100)
```

Build a `FluxNLPModel` from the neural network represented by `chain_ANN`; see Flux.jl for details on how `chain_ANN` is built. The other required data are an iterator over the training dataset `data_train`, an iterator over the test dataset `data_test`, and the minibatch size `size_minibatch`. Suppose `(xtrn, ytrn) = Fluxnlp.data_train`.
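As a sketch of the constructor above (assuming Flux.jl, MLDatasets.jl, and FluxNLPModels.jl are installed; the two-layer network architecture is an illustrative assumption, not prescribed by the package):

```julia
using Flux, MLDatasets, FluxNLPModels

# Illustrative two-layer network for flattened 28×28 MNIST images (assumption).
chain_ANN = Chain(Dense(784, 32, relu), Dense(32, 10))

# Build the model from the chain and the MNIST train/test iterators,
# with minibatches of size 100.
nlp = FluxNLPModel(chain_ANN,
                   MLDatasets.MNIST.traindata(Float32),
                   MLDatasets.MNIST.testdata(Float32);
                   size_minibatch = 100)
```

The resulting `nlp` then exposes the usual NLPModels API (`obj`, `grad!`, `objgrad!`) over the minibatched loss.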
FluxNLPModels.accuracy — Method

```julia
accuracy(nlp::AbstractFluxNLPModel)
```

Compute the accuracy of the network `nlp.chain` on the entire test dataset. The `data_loader` can be overwritten to include other data; the device is set to `cpu`.
FluxNLPModels.minibatch_next_test! — Method

```julia
minibatch_next_test!(nlp::AbstractFluxNLPModel)
```

Select the next minibatch from `nlp.test_minibatch_iterator` and return the new current status of the iterator `nlp.current_test_minibatch`. `minibatch_next_test!` is meant to be used in a loop or method call; if it returns `false`, the iterator has reached the end of the minibatches.
FluxNLPModels.minibatch_next_train! — Method

```julia
minibatch_next_train!(nlp::AbstractFluxNLPModel)
```

Select the next minibatch from `nlp.training_minibatch_iterator` and return the new current status of the iterator `nlp.current_training_minibatch`. `minibatch_next_train!` is meant to be used in a loop or method call; if it returns `false`, the iterator has reached the end of the minibatches.
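The two `minibatch_next_*!` methods above are designed for epoch-style loops. A minimal sketch, assuming `nlp` is an already-built `FluxNLPModel`:

```julia
# One pass over the training minibatches (sketch).
reset_minibatch_train!(nlp)           # rewind to the first minibatch
while true
    # ... evaluate the objective/gradient on nlp.current_training_minibatch ...
    minibatch_next_train!(nlp) || break   # false ⇒ end of the minibatches
end
```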
FluxNLPModels.reset_minibatch_test! — Method

```julia
reset_minibatch_test!(nlp::AbstractFluxNLPModel)
```

If a `data_loader` (an iterator object) was passed to `FluxNLPModel`, select the first test minibatch for `nlp`.
FluxNLPModels.reset_minibatch_train! — Method

```julia
reset_minibatch_train!(nlp::AbstractFluxNLPModel)
```

If a `data_loader` (an iterator object) was passed to `FluxNLPModel`, select the first training minibatch for `nlp`.
FluxNLPModels.set_vars! — Method

```julia
set_vars!(model::AbstractFluxNLPModel{T, S}, new_w::AbstractVector{T}) where {T <: Number, S}
```

Set the variables of `model` to `new_w` and rebuild the chain.
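`set_vars!` is how an external optimizer pushes a flat weight vector back into the Flux chain. A sketch, assuming `nlp` is an already-built `FluxNLPModel` (the use of `nlp.meta.nvar` for the number of variables follows the standard NLPModels `meta` convention):

```julia
# Illustrative new weight vector of the right length.
w_new = randn(Float32, nlp.meta.nvar)

# Copy the weights into the model and rebuild nlp.chain accordingly.
set_vars!(nlp, w_new)
```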
NLPModels.grad! — Method

```julia
g = grad!(nlp, w, g)
```

Evaluate ∇f(w), the gradient of the objective function at `w`, in place.
Arguments
- `nlp::AbstractFluxNLPModel{T, S}`: the `FluxNLPModel` data structure;
- `w::AbstractVector{T}`: the vector of weights/variables;
- `g::AbstractVector{T}`: the gradient vector.
Output
- `g`: the gradient at point `w`.
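`grad!` fills its buffer in place, which supports an allocation-light descent loop. A sketch of plain gradient descent, assuming `nlp` is an already-built `FluxNLPModel`; the step size and iteration count are illustrative assumptions:

```julia
w = copy(nlp.w)          # current weights (w field listed above)
g = similar(w)           # gradient buffer, overwritten at each step
η = 1f-2                 # illustrative step size
for _ in 1:10
    grad!(nlp, w, g)     # g ← ∇f(w), in place
    w .-= η .* g         # descent step, no new allocations
end
```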
NLPModels.obj — Method

```julia
f = obj(nlp, w)
```

Evaluate f(w), the objective function of `nlp` at `w`.
Arguments
- `nlp::AbstractFluxNLPModel{T, S}`: the `FluxNLPModel` data structure;
- `w::AbstractVector{T}`: the vector of weights/variables.
Output
- `f_w`: the value of the objective function at `w`.
NLPModels.objgrad! — Method

```julia
objgrad!(nlp, w, g)
```

Evaluate both f(w), the objective function of `nlp` at `w`, and ∇f(w), the gradient of the objective function at `w`, in place.
Arguments
- `nlp::AbstractFluxNLPModel{T, S}`: the `FluxNLPModel` data structure;
- `w::AbstractVector{T}`: the vector of weights/variables;
- `g::AbstractVector{T}`: the gradient vector.
Output
- `f_w, g`: the value of the objective function and the gradient at point `w`.
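When both the objective value and the gradient are needed at the same point (as in a line search), `objgrad!` avoids two separate evaluations. A sketch, assuming `nlp` is an already-built `FluxNLPModel`:

```julia
w = copy(nlp.w)
g = similar(w)

# One call returns f(w) and fills g with ∇f(w) in place.
f_w, g = objgrad!(nlp, w, g)
```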