Science Wiki

Differential (Διαφορικόν)


[Image: Newton and Leibniz]

Integral Calculus
Differential Calculus
Isaac Newton
Gottfried Leibniz
Derivative
Differential
(More precisely, Newton originally denoted the derivative with an overdot,
and Lagrange later, in the same spirit, replaced it with a prime.)

- A mathematical construct.

Etymology

The name "differential" is etymologically related to the word "difference".

Introduction

In calculus, the differential represents the principal part of the change in a function y = ƒ(x) with respect to changes in the independent variable. The differential dy is defined by

dy = ƒ′(x) dx,

where ƒ′(x) is the derivative of ƒ with respect to x, and dx is an additional real variable (so that dy is a function of x and dx). The notation is such that the equation

dy = (dy/dx) dx

holds, where the derivative is represented in the Leibniz notation dy/dx, and this is consistent with regarding the derivative as the quotient of the differentials. One also writes

dƒ(x) = ƒ′(x) dx.

The precise meaning of the variables dy and dx depends on the context of the application and the required level of mathematical rigor. The domain of these variables may take on a particular geometrical significance if the differential is regarded as a particular differential form, or analytical significance if the differential is regarded as a linear approximation to the increment of a function. In physical applications, the variables dx and dy are often constrained to be very small ("infinitesimal").

History and usage

The differential was first introduced via an intuitive or heuristic definition by Gottfried Wilhelm Leibniz, who thought of the differential dy as an infinitely small (or infinitesimal) change in the value y of the function, corresponding to an infinitely small change dx in the function's argument x. For that reason, the instantaneous rate of change of y with respect to x, which is the value of the derivative of the function, is denoted by the fraction

dy/dx

in what is called the Leibniz notation for derivatives. The quotient dy/dx is not infinitely small; rather it is a real number.

The use of infinitesimals in this form was widely criticized, for instance by the famous pamphlet The Analyst by Bishop Berkeley. Augustin-Louis Cauchy (1823) defined the differential without appeal to the atomism of Leibniz's infinitesimals.[1][2] Instead, Cauchy, following d'Alembert, inverted the logical order of Leibniz and his successors: the derivative itself became the fundamental object, defined as a limit of difference quotients, and the differentials were then defined in terms of it. That is, one was free to define the differential dy by an expression

dy = ƒ′(x) dx

in which dy and dx are simply new variables taking finite real values,[3] not fixed infinitesimals as they had been for Leibniz.[4]

According to [citation], Cauchy's approach was a significant logical improvement over the infinitesimal approach of Leibniz: instead of invoking the metaphysical notion of infinitesimals, the quantities dy and dx could now be manipulated in exactly the same manner as any other real quantities, in a meaningful way. Cauchy's overall conceptual approach to differentials remains the standard one in modern analytical treatments,[5] although the final word on rigor, a fully modern notion of the limit, was ultimately due to Karl Weierstrass.[6]

In physical treatments, such as those applied to the theory of thermodynamics, the infinitesimal view still prevails. [citation] reconciles the physical use of infinitesimal differentials with their mathematical impossibility as follows: the differentials represent finite non-zero values that are smaller than the degree of accuracy required for the particular purpose for which they are intended. Thus "physical infinitesimals" need not appeal to a corresponding mathematical infinitesimal in order to have a precise sense.

Following twentieth-century developments in mathematical analysis and differential geometry, it became clear that the notion of the differential of a function could be extended in a variety of ways. In real analysis, it is more desirable to deal directly with the differential as the principal part of the increment of a function. This leads directly to the notion that the differential of a function at a point is a linear functional of an increment Δx. This approach allows the differential (as a linear map) to be developed for a variety of more sophisticated spaces, ultimately giving rise to such notions as the Fréchet or Gâteaux derivative. Likewise, in differential geometry, the differential of a function at a point is a linear function of a tangent vector (an "infinitely small displacement"), which exhibits it as a kind of one-form: the exterior derivative of the function. In non-standard calculus, differentials are regarded as infinitesimals, which can themselves be put on a rigorous footing (see differential (infinitesimal)).

Definition

[Figure: The differential of a function ƒ(x) at a point x0.]

The differential is defined in modern treatments of differential calculus as follows.[7] The differential of a function ƒ(x) of a single real variable x is the function df of two independent real variables x and Δx given by

df(x, Δx) = ƒ′(x) Δx.

One or both of the arguments may be suppressed, i.e., one may see df(x) or simply df. If y = ƒ(x), the differential may also be written as dy. Since dx(x, Δx) = Δx, it is conventional to write dx = Δx, so that the following equality holds:

dy = ƒ′(x) dx.

This notion of differential is broadly applicable when a linear approximation to a function is sought, in which the value of the increment Δx is small enough. More precisely, if ƒ is a differentiable function at x, then the difference in y-values

Δy = ƒ(x + Δx) − ƒ(x)

satisfies

Δy = ƒ′(x) Δx + ε = dy + ε,

where the error ε in the approximation satisfies ε/Δx → 0 as Δx → 0. In other words, one has the approximate identity

Δy ≈ dy,

in which the error can be made as small as desired relative to Δx by constraining Δx to be sufficiently small; that is to say,

(Δy − dy)/Δx → 0

as Δx → 0. For this reason, the differential of a function is known as the principal (linear) part in the increment of a function: the differential is a linear function of the increment Δx, and although the error ε may be nonlinear, it tends to zero rapidly as Δx tends to zero.
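This "principal part" behavior is easy to check numerically. The following sketch (the function ƒ(x) = x³ and the point x₀ = 2 are arbitrary illustrative choices, not from the text) shows the ratio ε/Δx shrinking as Δx shrinks:

```python
# Numerical sketch: the differential dy = f'(x)*dx is the principal
# (linear) part of the increment Δy, and the error ε = Δy - dy
# satisfies ε/Δx -> 0 as Δx -> 0.  f(x) = x**3 is an arbitrary example.

def f(x):
    return x ** 3

def fprime(x):          # derivative of f, known in closed form here
    return 3 * x ** 2

x0 = 2.0
ratios = []
for dx in (1e-1, 1e-2, 1e-3):
    dy = fprime(x0) * dx          # the differential: principal part
    delta_y = f(x0 + dx) - f(x0)  # the true increment
    eps = delta_y - dy            # the error term
    ratios.append(abs(eps) / dx)  # should shrink toward 0

print(ratios)  # each ratio roughly 10x smaller than the last
```

Here ε = 3x₀(Δx)² + (Δx)³, so ε/Δx is itself of order Δx, illustrating why ε is called a higher-order term.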

Differentials in several variables

Following [citation], for functions of more than one independent variable,

y = ƒ(x1, ..., xn),

the partial differential of y with respect to any one of the variables x1 is the principal part of the change in y resulting from a change dx1 in that one variable. The partial differential is therefore

(∂y/∂x1) dx1,

involving the partial derivative of y with respect to x1. The sum of the partial differentials with respect to all of the independent variables is the total differential

dy = (∂y/∂x1) dx1 + ... + (∂y/∂xn) dxn,

which is the principal part of the change in y resulting from changes in the independent variables xi.

More precisely, in the context of multivariable calculus, following [citation], if ƒ is a differentiable function, then by the definition of differentiability, the increment

Δy = ƒ(x1 + Δx1, ..., xn + Δxn) − ƒ(x1, ..., xn)
   = (∂y/∂x1) Δx1 + ... + (∂y/∂xn) Δxn + ε1 Δx1 + ... + εn Δxn,

where the error terms εi tend to zero as the increments Δxi jointly tend to zero. The total differential is then rigorously defined as

dy = (∂y/∂x1) Δx1 + ... + (∂y/∂xn) Δxn.

Since, with this definition,

dxi(Δx1, ..., Δxn) = Δxi,

one has

dy = (∂y/∂x1) dx1 + ... + (∂y/∂xn) dxn.

As in the case of one variable, the approximate identity

Δy ≈ dy

holds, in which the total error can be made as small as desired relative to √(Δx1² + ... + Δxn²) by confining attention to sufficiently small increments.
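A small numerical sketch of the total differential (the function ƒ(x, y) = x²y and the sample point are arbitrary choices for illustration):

```python
# Sketch: the total differential df = (∂f/∂x)dx + (∂f/∂y)dy is the
# principal part of the increment Δf.  f(x, y) = x**2 * y is arbitrary.

def f(x, y):
    return x ** 2 * y

x0, y0 = 1.5, 2.0
fx = 2 * x0 * y0   # ∂f/∂x at (x0, y0)
fy = x0 ** 2       # ∂f/∂y at (x0, y0)

dx, dy_ = 1e-4, -2e-4
total_diff = fx * dx + fy * dy_              # total differential
delta_f = f(x0 + dx, y0 + dy_) - f(x0, y0)   # true increment

err = abs(delta_f - total_diff)
print(total_diff, err)  # the error is second-order small
```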

Higher-order differentials

Higher-order differentials of a function y = ƒ(x) of a single variable x can be defined via:[8]

d²y = d(dy) = d(ƒ′(x) dx) = ƒ″(x) (dx)²,

and, in general,

dⁿy = ƒ⁽ⁿ⁾(x) (dx)ⁿ.

Informally, this justifies Leibniz's notation for higher-order derivatives

ƒ⁽ⁿ⁾(x) = dⁿy / dxⁿ.

When the independent variable x itself is permitted to depend on other variables, then the expression becomes more complicated, as it must include also higher-order differentials in x itself. Thus, for instance,

d²y = ƒ″(x) (dx)² + ƒ′(x) d²x,
d³y = ƒ‴(x) (dx)³ + 3ƒ″(x) dx d²x + ƒ′(x) d³x,

and so forth.
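The relation d²y = ƒ″(x)(dx)² can be checked against a second forward difference of ƒ, which agrees with it up to higher-order terms (the function ƒ(x) = x³ and the step are arbitrary illustrative choices):

```python
# Sketch: for f(x) = x**3 (arbitrary), the second differential
# d²y = f''(x) (dx)² matches the second forward difference of f
# up to terms of higher order in dx.

def f(x):
    return x ** 3

x0, dx = 1.0, 1e-3
d2y = 6 * x0 * dx ** 2                           # f''(x) = 6x
fwd2 = f(x0 + 2 * dx) - 2 * f(x0 + dx) + f(x0)   # 2nd forward difference

rel_err = abs(fwd2 - d2y) / abs(d2y)
print(rel_err)  # of order dx relative to d²y
```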

Similar considerations apply to defining higher-order differentials of functions of several variables. For example, if ƒ is a function of two variables x and y, then

dⁿƒ = Σ_{k=0}^{n} C(n, k) ∂ⁿƒ/(∂x^k ∂y^(n−k)) (dx)^k (dy)^(n−k),

where C(n, k) is a binomial coefficient. In more variables, an analogous expression holds, but with an appropriate multinomial expansion rather than binomial expansion.[9]

Higher-order differentials in several variables also become more complicated when the independent variables are themselves allowed to depend on other variables. For instance, for a function ƒ of x and y which are allowed to depend on auxiliary variables, one has

d²ƒ = (∂²ƒ/∂x²) (dx)² + 2 (∂²ƒ/∂x∂y) dx dy + (∂²ƒ/∂y²) (dy)² + (∂ƒ/∂x) d²x + (∂ƒ/∂y) d²y,

Because of this notational infelicity, the use of higher-order differentials was roundly criticized by [citation], who concluded:

Enfin, que signifie ou que représente l'égalité [...]? A mon avis, rien du tout.

That is: "Finally, what is meant, or represented, by the equality [...]? In my opinion, nothing at all." In spite of this skepticism, higher-order differentials did emerge as an important tool in analysis.[10]

In these contexts, the nth-order differential of the function ƒ applied to an increment Δx is defined by

dⁿƒ(x, Δx) = (dⁿ/dtⁿ) ƒ(x + t Δx) |_{t=0},

or an equivalent expression, such as

lim_{t→0} Δⁿ_{tΔx} ƒ / tⁿ,

where Δⁿ_{tΔx} ƒ is an nth forward difference with increment tΔx.

This definition makes sense as well if ƒ is a function of several variables (for simplicity taken here as a vector argument). Then the nth differential defined in this way is a homogeneous function of degree n in the vector increment Δx. Furthermore, the Taylor series of ƒ at the point x is given by

ƒ(x + Δx) = ƒ(x) + dƒ(x, Δx) + (1/2!) d²ƒ(x, Δx) + ... + (1/n!) dⁿƒ(x, Δx) + ...
The higher order Gâteaux derivative generalizes these considerations to infinite dimensional spaces.
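The Taylor expansion written with nth-order differentials can be sketched numerically; ƒ = exp is a convenient arbitrary choice because every derivative equals exp, so dⁿƒ(x, Δx) = eˣ (Δx)ⁿ:

```python
import math

# Sketch: Taylor series written with nth-order differentials,
# f(x + Δx) ≈ Σ dⁿf(x, Δx)/n!, for f = exp (all derivatives are exp,
# so dⁿf(x, Δx) = exp(x) * Δx**n).  The point and increment are arbitrary.

x0, dx = 0.5, 0.3
partial = sum(math.exp(x0) * dx ** n / math.factorial(n) for n in range(15))
exact = math.exp(x0 + dx)

print(abs(partial - exact))  # tiny truncation error
```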

Properties

A number of properties of the differential follow in a straightforward manner from the corresponding properties of the derivative, partial derivative, and total derivative. These include:[11]

  • Linearity: For constants a and b and differentiable functions ƒ and g,

    d(aƒ + bg) = a dƒ + b dg.

  • Product rule: For two differentiable functions ƒ and g,

    d(ƒg) = ƒ dg + g dƒ.

An operation d with these two properties is known in abstract algebra as a derivation. They imply the power rule

d(ƒⁿ) = n ƒⁿ⁻¹ dƒ.
In addition, various forms of the chain rule hold, in increasing level of generality:[12]

  • If y = ƒ(u) is a differentiable function of the variable u and u = g(x) is a differentiable function of x, then

    dy = ƒ′(u) du = ƒ′(g(x)) g′(x) dx.

  • If y = ƒ(x1, ..., xn) and all of the variables x1, ..., xn depend on another variable t, then by the chain rule for partial derivatives, one has

    dy = (∂y/∂x1) dx1 + ... + (∂y/∂xn) dxn.

    Heuristically, the chain rule for several variables can itself be understood by dividing through both sides of this equation by the infinitely small quantity dt.
  • More general analogous expressions hold, in which the intermediate variables xi depend on more than one variable.
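The one-variable chain rule in differential form, dy = ƒ′(u) du with u = g(x), can be sketched as follows (the composition y = sin(x²) is an arbitrary choice):

```python
import math

# Sketch: chain rule in differential form, dy = f'(u) du with u = g(x),
# for the arbitrary composition y = sin(x**2).

x0, dx = 0.4, 1e-4
u0 = x0 ** 2               # u = g(x) = x²
du = 2 * x0 * dx           # du = g'(x) dx
dy = math.cos(u0) * du     # dy = f'(u) du, with f = sin

delta_y = math.sin((x0 + dx) ** 2) - math.sin(x0 ** 2)  # true increment
print(abs(delta_y - dy))  # second-order small
```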

General formulation

A consistent notion of differential can be developed for a function ƒ : Rn → Rm between two Euclidean spaces. Let x, Δx ∈ Rn be a pair of Euclidean vectors. The increment in the function ƒ is

Δƒ = ƒ(x + Δx) − ƒ(x).

If there exists an m × n matrix A such that

Δƒ = A Δx + ‖Δx‖ ε,

in which the vector ε → 0 as Δx → 0, then ƒ is by definition differentiable at the point x. The matrix A is sometimes known as the Jacobian matrix, and the linear transformation that associates to the increment Δx ∈ Rn the vector A Δx ∈ Rm is, in this general setting, known as the differential dƒ(x) of ƒ at the point x. This is precisely the Fréchet derivative, and the same construction can be made to work for a function between any Banach spaces.
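A minimal sketch for m = n = 2: the differential is the linear map Δx ↦ A·Δx, with A the Jacobian matrix (the map ƒ(x, y) = (xy, x + y²) and the sample point are arbitrary illustrative choices):

```python
# Sketch: for f: R² -> R², the differential at a point is the linear map
# Δx ↦ A·Δx with A the Jacobian matrix.  f(x, y) = (x*y, x + y**2) is
# an arbitrary example.

def f(x, y):
    return (x * y, x + y ** 2)

x0, y0 = 1.0, 2.0
A = [[y0, x0],        # Jacobian: row i is the gradient of component i
     [1.0, 2 * y0]]

dx, dy_ = 1e-4, 2e-4
lin = (A[0][0] * dx + A[0][1] * dy_,   # A · Δx, first component
       A[1][0] * dx + A[1][1] * dy_)   # A · Δx, second component

fx = f(x0 + dx, y0 + dy_)
f0 = f(x0, y0)
err = max(abs(fx[i] - f0[i] - lin[i]) for i in range(2))
print(err)  # second-order small in the increments
```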

Another fruitful point of view is to define the differential directly as a kind of directional derivative:

dƒ(x, h) = lim_{t→0} (ƒ(x + t h) − ƒ(x)) / t = (d/dt) ƒ(x + t h) |_{t=0},

which is the approach already taken for defining higher-order differentials (and is most nearly the definition set forth by Cauchy). If t represents time and x position, then h represents a velocity instead of a displacement as we have heretofore regarded it. This yields yet another refinement of the notion of differential: that it should be a linear function of a kinematic velocity. The set of all velocities through a given point of space is known as the tangent space, and so dƒ gives a linear function on the tangent space: a differential form. With this interpretation, the differential of ƒ is known as the exterior derivative, and has broad application in differential geometry because the notion of velocities and the tangent space makes sense on any differentiable manifold. If, in addition, the output value of ƒ also represents a position (in a Euclidean space), then a dimensional analysis confirms that the output value of dƒ must be a velocity. If one treats the differential in this manner, then it is known as the pushforward since it "pushes" velocities from a source space into velocities in a target space.

Other approaches

Although the notion of having an infinitesimal increment dx is not well-defined in modern mathematical analysis, a variety of techniques exist for defining the infinitesimal differential so that the differential of a function can be handled in a manner that does not clash with the Leibniz notation. These include:

  • Defining the differential as a kind of differential form, specifically the exterior derivative of a function. The infinitesimal increments are then identified with vectors in the tangent space at a point. This approach is popular in differential geometry and related fields, because it readily generalizes to mappings between differentiable manifolds.
  • Differentials as nilpotent elements of commutative rings. This approach is popular in algebraic geometry.[13]
  • Differentials in smooth models of set theory. This approach is known as synthetic differential geometry or smooth infinitesimal analysis and is closely related to the algebraic geometric approach, except that ideas from topos theory are used to hide the mechanisms by which nilpotent infinitesimals are introduced.[14]
  • Differentials as infinitesimals in hyperreal number systems, which are extensions of the real numbers which contain invertible infinitesimals and infinitely large numbers. This is the approach of nonstandard analysis pioneered by Abraham Robinson.[15]

Examples and applications

Differentials may be effectively used in numerical analysis to study the propagation of experimental errors in a calculation, and thus the overall numerical stability of a problem [citation]. Suppose that the variable x represents the outcome of an experiment and y is the result of a numerical computation applied to x. The question is to what extent errors in the measurement of x influence the outcome of the computation of y. If x is known to within Δx of its true value, then Taylor's theorem gives the following estimate on the error Δy in the computation of y:

Δy = ƒ′(x) Δx + (Δx)²/2 ƒ″(ξ),

where ξ = x + θΔx for some 0 < θ < 1. If Δx is small, then the second-order term is negligible, so that Δy is, for practical purposes, well-approximated by dy = ƒ′(x) Δx.
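A minimal error-propagation sketch (the measurement function ƒ(x) = x², the measured value, and the uncertainty are all arbitrary illustrative choices):

```python
# Sketch of the error-propagation estimate: if x is known to within Δx,
# then the error in y = f(x) is Δy ≈ dy = f'(x)·Δx.
# f(x) = x**2 is an arbitrary example.

def f(x):
    return x ** 2

x_meas = 10.0          # measured value (arbitrary)
dx = 0.05              # measurement uncertainty (arbitrary)
dy = 2 * x_meas * dx   # first-order estimate: f'(x)·Δx

worst = abs(f(x_meas + dx) - f(x_meas))  # actual worst-case error
print(dy, worst)  # estimate 1.0 vs actual 1.0025
```

The gap between the two numbers is exactly the neglected second-order term (Δx)² = 0.0025.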

The differential is often useful to rewrite a differential equation

dy/dx = g(x)

in the form

dy = g(x) dx,

in particular when one wants to separate the variables.
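For example, separating variables in dy/dx = x/y gives y dy = x dx, which integrates to y² − x² = C. The sketch below (the equation, initial condition, and the simple Euler scheme are all illustrative choices) checks that a numerical integration approximately conserves this C:

```python
# Sketch: dy/dx = x/y separates as y dy = x dx, integrating to
# y² - x² = C.  A forward-Euler integration of the ODE (arbitrary
# scheme and step size) should approximately conserve C.

x, y = 0.0, 2.0
C = y ** 2 - x ** 2    # constant fixed by the initial condition
h = 1e-4
for _ in range(10000):      # integrate out to x = 1
    y += (x / y) * h        # Euler step: dy = (x/y) dx
    x += h

drift = abs((y ** 2 - x ** 2) - C)
print(drift)  # small discretization drift, of order h
```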

Notes

  1. For a detailed historical account of the differential, see [citation], especially page 275 for Cauchy's contribution on the subject. An abbreviated account appears in [citation].
  2. Cauchy explicitly denied the possibility of actual infinitesimal and infinite quantities [citation], and took the radically different point of view that "a variable quantity becomes infinitely small when its numerical value decreases indefinitely in such a way as to converge to zero" ([citation]; translation from [citation]).
  3. [citation]
  4. [citation]: "The differentials as thus defined are only new variables, and not fixed infinitesimals..."
  5. [citation]: "Here we remark merely in passing that it is possible to use this approximate representation of the increment Δy by the linear expression ƒ′(x)Δx to construct a logically satisfactory definition of a "differential", as was done by Cauchy in particular."
  6. [citation]
  7. See, for instance, the influential treatises of [citation], [citation], [citation], and [citation]. Tertiary sources for this definition include also [citation] and [citation].
  8. [citation]. See also, for instance, [citation].
  9. [citation]
  10. In particular to infinite-dimensional holomorphy [citation] and numerical analysis via the calculus of finite differences.
  11. [citation]
  12. [citation]
  13. [citation]
  14. See [citation] and [citation].
  15. See [citation] and [citation].


