Gradient #
Main Definitions #
Let f be a function from a Hilbert space F to 𝕜 (where 𝕜 is ℝ or ℂ), let x be a point in F, and let f' be a vector in F. Then
HasGradientWithinAt f f' s x
says that f has a gradient f' at x, where the domain of interest
is restricted to s. We also have
HasGradientAt f f' x := HasGradientWithinAt f f' x univ
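To make the setup concrete, here is a minimal Lean sketch against Mathlib's Mathlib.Analysis.Calculus.Gradient.Basic. The preamble (RCLike 𝕜 for "𝕜 is ℝ or ℂ", plus CompleteSpace F, which the Riesz representation map InnerProductSpace.toDual needs) is our assumption about the ambient file, and the proof route is one plausible choice rather than the library's own lemma.

```lean
import Mathlib.Analysis.Calculus.Gradient.Basic

-- Assumed ambient setup: a complete inner product space `F` over `𝕜`,
-- where `RCLike 𝕜` captures "`𝕜` is `ℝ` or `ℂ`".
variable {𝕜 F : Type*} [RCLike 𝕜] [NormedAddCommGroup F]
  [InnerProductSpace 𝕜 F] [CompleteSpace F]

-- `HasGradientAt` is `HasGradientWithinAt` on the whole space: both sides
-- reduce to `HasGradientAtFilter`, and `𝓝[Set.univ] x = 𝓝 x`.
example {f : F → 𝕜} {f' x : F} :
    HasGradientWithinAt f f' Set.univ x ↔ HasGradientAt f f' x := by
  unfold HasGradientWithinAt HasGradientAt
  rw [nhdsWithin_univ]
```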
Main results #
This file contains the following results about the gradient.
- the definition of the gradient.
- the theorems translating between HasGradientAtFilter and HasFDerivAtFilter, HasGradientWithinAt and HasFDerivWithinAt, HasGradientAt and HasFDerivAt, and gradient and fderiv.
- the theorems about the uniqueness of the gradient (sketched below, after the HasGradientAt definition).
- the theorems translating between HasGradientAtFilter and HasDerivAtFilter, HasGradientAt and HasDerivAt, and gradient and deriv, when F = 𝕜.
- the theorems about the congruence of the gradient.
- the theorems about the gradient of a constant function (see the sketch after this list).
- the theorems about the continuity of a function admitting a gradient (see the sketch after this list).
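The last two items can be checked by hand from the definitional equations given below. A hedged sketch, with the same assumed preamble as above; the route through map_zero and hasFDerivAt_const is ours, and HasGradientAt.hasFDerivAt is the name we assume for the alias listed at the end of this page.

```lean
import Mathlib.Analysis.Calculus.Gradient.Basic

variable {𝕜 F : Type*} [RCLike 𝕜] [NormedAddCommGroup F]
  [InnerProductSpace 𝕜 F] [CompleteSpace F]

-- Constant functions have gradient `0`: after unfolding, the goal is about
-- `toDual 𝕜 F 0`, which `map_zero` collapses to the zero functional.
example (c : 𝕜) (x : F) : HasGradientAt (fun _ => c) (0 : F) x := by
  unfold HasGradientAt HasGradientAtFilter
  rw [map_zero]
  exact hasFDerivAt_const c x

-- A function admitting a gradient at `x` is continuous at `x`, by passing
-- to the Fréchet derivative.
example {f : F → 𝕜} {f' x : F} (h : HasGradientAt f f' x) :
    ContinuousAt f x :=
  h.hasFDerivAt.continuousAt
```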
A function f has the gradient f' as derivative along the filter L if
f x' = f x + ⟨f', x' - x⟩ + o (x' - x) when x' converges along the filter L.
Equations
- HasGradientAtFilter f f' x L = HasFDerivAtFilter f ((InnerProductSpace.toDual 𝕜 F) f') x L
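Since the equation above defines HasGradientAtFilter as the corresponding HasFDerivAtFilter statement, the translation listed under main results is definitional. A sketch with the same assumed preamble:

```lean
import Mathlib.Analysis.Calculus.Gradient.Basic

variable {𝕜 F : Type*} [RCLike 𝕜] [NormedAddCommGroup F]
  [InnerProductSpace 𝕜 F] [CompleteSpace F]

-- The translation to `HasFDerivAtFilter` is definitional: the gradient `f'`
-- is repackaged as the continuous linear functional `⟨f', ·⟩` via `toDual`.
example {f : F → 𝕜} {f' x : F} {L : Filter F} :
    HasGradientAtFilter f f' x L ↔
      HasFDerivAtFilter f (InnerProductSpace.toDual 𝕜 F f') x L :=
  Iff.rfl
```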
f has the gradient f' at the point x within the subset s if
f x' = f x + ⟨f', x' - x⟩ + o (x' - x) where x' converges to x inside s.
Equations
- HasGradientWithinAt f f' s x = HasGradientAtFilter f f' x (nhdsWithin x s)
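The translation lemma for the within-a-set version, hasGradientWithinAt_iff_hasFDerivWithinAt, is named at the end of this page; the exact statement shape below, with the gradient repackaged through toDual, is our assumption about its form.

```lean
import Mathlib.Analysis.Calculus.Gradient.Basic

variable {𝕜 F : Type*} [RCLike 𝕜] [NormedAddCommGroup F]
  [InnerProductSpace 𝕜 F] [CompleteSpace F]

-- Translation to the Fréchet derivative within `s`; this is also
-- definitional, so `Iff.rfl` would close it as well.
example {f : F → 𝕜} {f' x : F} {s : Set F} :
    HasGradientWithinAt f f' s x ↔
      HasFDerivWithinAt f (InnerProductSpace.toDual 𝕜 F f') s x :=
  hasGradientWithinAt_iff_hasFDerivWithinAt
```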
f has the gradient f' at the point x if
f x' = f x + ⟨f', x' - x⟩ + o (x' - x) where x' converges to x.
Equations
- HasGradientAt f f' x = HasGradientAtFilter f f' x (nhds x)
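The uniqueness of the gradient, listed under main results, follows from uniqueness of the Fréchet derivative along 𝓝 x together with injectivity of toDual. A hedged sketch, same assumed preamble:

```lean
import Mathlib.Analysis.Calculus.Gradient.Basic

variable {𝕜 F : Type*} [RCLike 𝕜] [NormedAddCommGroup F]
  [InnerProductSpace 𝕜 F] [CompleteSpace F]

-- Two gradients at the same point represent the same Fréchet derivative,
-- and `toDual` is injective, so the vectors agree.
example {f : F → 𝕜} {f' g' x : F}
    (h₁ : HasGradientAt f f' x) (h₂ : HasGradientAt f g' x) : f' = g' :=
  (InnerProductSpace.toDual 𝕜 F).injective (h₁.hasFDerivAt.unique h₂.hasFDerivAt)
```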
Gradient of f at the point x within the set s, if it exists. Zero otherwise.
If the derivative exists (i.e., ∃ f', HasGradientWithinAt f f' s x), then
f x' = f x + ⟨f', x' - x⟩ + o (x' - x) where x' converges to x inside s.
Equations
- gradientWithin f s x = (InnerProductSpace.toDual 𝕜 F).symm (fderivWithin 𝕜 f s x)
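Applying toDual undoes the .symm in the equation above, so the Fréchet derivative within s is recoverable from the gradient within s. A small sketch, same assumed preamble:

```lean
import Mathlib.Analysis.Calculus.Gradient.Basic

variable {𝕜 F : Type*} [RCLike 𝕜] [NormedAddCommGroup F]
  [InnerProductSpace 𝕜 F] [CompleteSpace F]

-- Round-trip through the Riesz map: `toDual` applied to the gradient
-- within `s` gives back `fderivWithin`.
example {f : F → 𝕜} {s : Set F} {x : F} :
    InnerProductSpace.toDual 𝕜 F (gradientWithin f s x) = fderivWithin 𝕜 f s x :=
  (InnerProductSpace.toDual 𝕜 F).apply_symm_apply _
```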
Gradient of f at the point x, if it exists. Zero otherwise.
Denoted as ∇ within the Gradient namespace.
If the derivative exists (i.e., ∃ f', HasGradientAt f f' x), then
f x' = f x + ⟨f', x' - x⟩ + o (x' - x) where x' converges to x.
Equations
- gradient f x = (InnerProductSpace.toDual 𝕜 F).symm (fderiv 𝕜 f x)
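When f is differentiable at x, gradient f x does satisfy the defining property HasGradientAt. The unfold-and-cancel proof below is our own route from the equation above, not necessarily the library's lemma for this fact:

```lean
import Mathlib.Analysis.Calculus.Gradient.Basic

variable {𝕜 F : Type*} [RCLike 𝕜] [NormedAddCommGroup F]
  [InnerProductSpace 𝕜 F] [CompleteSpace F]

-- Unfold to the Fréchet-derivative statement and cancel
-- `toDual ∘ toDual.symm`.
example {f : F → 𝕜} {x : F} (h : DifferentiableAt 𝕜 f x) :
    HasGradientAt f (gradient f x) x := by
  unfold HasGradientAt HasGradientAtFilter gradient
  rw [LinearIsometryEquiv.apply_symm_apply]
  exact h.hasFDerivAt
```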
Gradient of f at the point x, if it exists. Zero otherwise.
Denoted as ∇ within the Gradient namespace.
If the derivative exists (i.e., ∃ f', HasGradientAt f f' x), then
f x' = f x + ⟨f', x' - x⟩ + o (x' - x) where x' converges to x.
Equations
- Gradient.«term∇» = Lean.ParserDescr.node `Gradient.«term∇» 1024 (Lean.ParserDescr.symbol "∇")
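A small usage sketch for the notation, assuming it is scoped and enabled by open Gradient, as "within the Gradient namespace" suggests. The computation unfolds ∇ to fderiv, which vanishes on constants:

```lean
import Mathlib.Analysis.Calculus.Gradient.Basic

open Gradient  -- assumed to enable the `∇` notation

variable {𝕜 F : Type*} [RCLike 𝕜] [NormedAddCommGroup F]
  [InnerProductSpace 𝕜 F] [CompleteSpace F]

-- The gradient of a constant function is `0`: unfold to `fderiv`, rewrite
-- with `fderiv_const_apply`, then collapse `toDual.symm 0` with `map_zero`.
example (c : 𝕜) (x : F) : ∇ (fun _ => c) x = (0 : F) := by
  unfold gradient
  rw [fderiv_const_apply, map_zero]
```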
Alias of the forward direction of hasGradientWithinAt_iff_hasFDerivWithinAt.
Alias of the forward direction of hasFDerivWithinAt_iff_hasGradientWithinAt.
Alias of the forward direction of hasGradientAt_iff_hasFDerivAt.
Alias of the forward direction of hasFDerivAt_iff_hasGradientAt.
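Finally, a hedged sketch of how these aliases are used with dot notation; the name HasGradientAt.hasFDerivAt is assumed from Mathlib's usual alias convention for the third item above.

```lean
import Mathlib.Analysis.Calculus.Gradient.Basic

variable {𝕜 F : Type*} [RCLike 𝕜] [NormedAddCommGroup F]
  [InnerProductSpace 𝕜 F] [CompleteSpace F]

-- Forward direction of `hasGradientAt_iff_hasFDerivAt`, used via its alias.
example {f : F → 𝕜} {f' x : F} (h : HasGradientAt f f' x) :
    HasFDerivAt f (InnerProductSpace.toDual 𝕜 F f') x :=
  h.hasFDerivAt
```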