NEWS
pnd 0.0.9
- Fix: fixed a regression with the default step size
- Fix: parallelised Hessians in the same manner as gradients
- Feature: compatibility of `Hessian()` with the arguments for the "Richardson" and "simple" methods from numDeriv (see the sketch after this list)
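
A minimal sketch of the numDeriv-style call; the function and inputs are illustrative, and the assumption is that the `method` argument is accepted as-is:

```r
library(pnd)

# Scalar-valued function whose Hessian we want at a point
f <- function(x) sum(sin(x))

# numDeriv-style method arguments, assumed to be passed through unchanged
Hessian(f, x = 1:3, method = "Richardson")
Hessian(f, x = 1:3, method = "simple")
```
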
pnd 0.0.8 (2025-03-06)
- Fix: sped up CPU core request diagnostics for 1-core operations
- Fix: Using full paths on Macs
pnd 0.0.7 (2025-03-01)
- Fix: removed obsolete environment creation for cluster export
- Fix: changed physical core detection on Macs
- Misc: the package has been added to CRAN; fewer syntax changes are expected
pnd 0.0.6 (2025-02-25)
- Fix: Derivatives of vectorised functions are working, e.g. `Grad(sin, 1:4)` (see the example after this list)
- Feature: Auto-detecting the number of cores available on multi-core machines to speed up computations
- Feature: Added plug-in step size selection with `f'''` estimated via a rule of thumb
- Feature: Auto-detection of parallel type
- Feature: Added a zero tolerance to the default step when a fixed step is used
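
For illustration, the vectorised call above can be checked against the analytical derivative; the shape of the returned object is my assumption:

```r
library(pnd)

# Element-wise derivative of a vectorised function at four points
Grad(sin, 1:4)

# Analytical answer for comparison
cos(1:4)
```
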
pnd 0.0.5
- Feature: Extended the step-selection routines to gradients (vector input `x`)
- Feature: Parallelisation of step selection in all algorithms
- Feature: Mathur's AutoDX algorithm for step size selection, `step.M()`
- Feature: Added `Hessian()` that supports central differences (for the moment) and arbitrary accuracy
- Feature: Separate `Grad()` and `Jacobian()` that call the workhorse, `GenD()`, for compatibility with numDeriv (see the sketch after this list)
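
A short sketch of how the wrappers relate to the workhorse, assuming `FUN` and `x` are the first two arguments of each:

```r
library(pnd)

f.scalar <- function(x) sum(x^2)             # scalar output: gradient
f.vector <- function(x) c(sum(x), prod(x))   # vector output: Jacobian

Grad(f.scalar, x = 1:3)      # length-3 gradient
Jacobian(f.vector, x = 1:3)  # 2-by-3 Jacobian matrix

# Both wrappers delegate to the same general routine
GenD(f.scalar, x = 1:3)
```
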
pnd 0.0.4
- Feature: Stepleman--Winarsky algorithm for step size selection, `step.SW()` (see the sketch after this list)
- Feature: Automated wrapper for step size selection, `gradstep()`
- Improvement: Safe handling of function errors and non-finite returns in step-size selection procedures
- Improvement: Finite-difference coefficients gained attributes: Taylor expansion, coefficient on the largest truncated term, and effective accuracy (useful for custom stencils)
- Improvement: added unit tests for core features
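
A hedged sketch of the two entry points; the argument names and the method label "SW" in `gradstep()` are assumptions, not confirmed by this file:

```r
library(pnd)

f <- function(x) x^9   # rapidly growing higher derivatives make the step matter

# Stepleman--Winarsky search for a good step at x = 1
step.SW(FUN = f, x = 1)

# Automated wrapper that dispatches to a chosen step-selection routine;
# "SW" is assumed to select the Stepleman--Winarsky method
gradstep(FUN = f, x = 1, method = "SW")
```
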
pnd 0.0.3
- Feature: `solveVandermonde()` to solve ill-conditioned problems that arise in weight calculation (see the sketch after this list)
- Feature: Dumontet--Vignes algorithm for step size selection, `step.DV()`
- Feature: Curtis--Reid algorithm for step size selection, `step.CR()`, and its modification
- Feature: Different step sizes for the gradient
- Fix: If the user supplies a short custom stencil and requests a high accuracy order, the best available order is used and a warning is produced
- Fix: The output of `Grad()` preserves the names of `x` and `FUN(x)`, which prevents errors in cases where names are required
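
A minimal sketch of recovering central-difference weights from a Vandermonde system; the argument names `s` and `b` and the right-hand-side convention are assumptions:

```r
library(pnd)

# Weights w solving sum(w * s^k) = k! * (k == 1) for k = 0, 1, 2,
# i.e. the first derivative on the stencil {-1, 0, 1}
solveVandermonde(s = c(-1, 0, 1), b = c(0, 1, 0))
# Expected weights under this convention: c(-0.5, 0, 0.5)
```
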
pnd 0.0.2
- Fix: bug in stencil calculation
pnd 0.0.1
- Initial release
- Computing finite-difference coefficients on arbitrary stencils
- Computing numerical gradients with reasonable default step sizes
- Numerical Jacobians
- Support for `mclapply()` on *nix systems only (see the sketch after this list)
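
A hedged sketch of the initial interface: coefficients on a custom stencil and a parallel gradient. The function name `fdCoef()`, its arguments, and the `cores` argument of `Grad()` are assumptions, not taken from this file:

```r
library(pnd)

# Finite-difference coefficients of the first derivative on a custom stencil
# (function and argument names assumed)
fdCoef(deriv.order = 1, stencil = c(-1, 0, 1))

# Numerical gradient with the default step size; 'cores = 2' is assumed to
# enable mclapply()-based parallelism on *nix systems
Grad(function(x) sum(exp(x)), x = 1:3, cores = 2)
```
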