Partial Derivative & Gradient Calculator: Multivariable Calculus


Compute ∇f, partial derivatives, and visualize steepest ascent.

Variables: x, y, z. Supported: sin, cos, exp, ^, etc.


Partial Derivatives & The Gradient

In single-variable calculus, the derivative tells us the slope of a curve. In Multivariable Calculus, a function can change in several directions at once. To handle this, we use Partial Derivatives and the Gradient Vector.

1. Partial Derivatives (∂)

A partial derivative measures how the function changes if you move in only one direction while keeping all other variables constant.

  • ∂f/∂x: Slope in the x-direction (y is constant).
  • ∂f/∂y: Slope in the y-direction (x is constant).
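The "hold everything else constant" idea can be checked numerically. The sketch below estimates a partial derivative with a central difference: nudge only one variable while the others stay fixed. The function f(x, y) = x²y and step size h are illustrative choices, not part of the calculator itself.

```python
# Central-difference estimate of a partial derivative: nudge only
# coordinate i, keep every other variable constant.
def partial(f, point, i, h=1e-6):
    up = list(point); up[i] += h
    down = list(point); down[i] -= h
    return (f(*up) - f(*down)) / (2 * h)

# Example: f(x, y) = x^2 * y, so ∂f/∂x = 2xy and ∂f/∂y = x^2.
f = lambda x, y: x**2 * y

print(partial(f, (3.0, 2.0), 0))  # ≈ 2*3*2 = 12
print(partial(f, (3.0, 2.0), 1))  # ≈ 3^2  = 9
```

The numerical estimates match the hand-computed partials to several decimal places, which is a useful sanity check when differentiating by hand.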

2. The Gradient (∇f)

The gradient is a vector that packages all partial derivatives together: ∇f = < ∂f/∂x, ∂f/∂y >. For a function of three variables, add ∂f/∂z as a third component.

Key Property: The gradient always points in the direction of steepest ascent (fastest increase). Its magnitude tells you how steep that slope is.

How to Calculate

  1. Differentiate with respect to x: Treat ‘y’ and ‘z’ like regular numbers (constants). E.g., the derivative of x²y with respect to x is 2xy.
  2. Differentiate with respect to y: Treat ‘x’ and ‘z’ like constants. E.g., the derivative of x²y with respect to y is x².
  3. Assemble: Put these results into a vector.
  4. Evaluate: Plug in your specific point (x, y, z) to get the final numerical vector.
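The four steps above can be written out directly for the running example f(x, y) = x²y, with the partials differentiated by hand:

```python
# Steps 1-4 applied to f(x, y) = x^2 * y.
def grad_f(x, y):
    df_dx = 2 * x * y      # step 1: differentiate w.r.t. x (y constant)
    df_dy = x ** 2         # step 2: differentiate w.r.t. y (x constant)
    return (df_dx, df_dy)  # step 3: assemble into a vector

print(grad_f(3, 2))  # step 4: evaluate at the point (3, 2) → (12, 9)
```

So at the point (3, 2), the gradient is < 12, 9 >: the function climbs fastest in that direction.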

Frequently Asked Questions (FAQ)

Q: What if the gradient is the zero vector < 0, 0 >?

A: This means you are at a “Critical Point”—usually a peak (maximum), a valley (minimum), or a saddle point. It’s like standing on flat ground on top of a hill.
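A quick illustration, using the bowl-shaped function f(x, y) = x² + y² (an assumed example): its minimum sits at the origin, and both partials vanish there.

```python
# f(x, y) = x^2 + y^2 has gradient <2x, 2y>, which is <0, 0>
# exactly at the origin -- the bottom of the bowl (a minimum).
def grad(x, y):
    return (2 * x, 2 * y)

print(grad(0.0, 0.0))  # (0.0, 0.0): a critical point
print(grad(1.0, 3.0))  # nonzero away from the minimum
```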

Q: Why is the vector perpendicular to level curves?

A: Since the gradient points in the direction of fastest change, it must be perpendicular to the direction of no change (which is the level curve or contour line).
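This perpendicularity is easy to verify for f(x, y) = x² + y² (an assumed example), whose level curves are circles: the tangent to the circle at (x, y) points along (-y, x), and its dot product with the gradient is zero.

```python
# At any point on the circle x^2 + y^2 = 25, the gradient <2x, 2y>
# is perpendicular to the circle's tangent direction <-y, x>.
x, y = 3.0, 4.0
grad = (2 * x, 2 * y)     # gradient of f(x, y) = x^2 + y^2
tangent = (-y, x)         # tangent to the level curve (circle)
dot = grad[0] * tangent[0] + grad[1] * tangent[1]
print(dot)  # 0.0 -- perpendicular
```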

Q: Can I do this for 4 or more variables?

A: Absolutely! The math is exactly the same. For a function with 100 variables, the gradient is just a vector with 100 components. This is the core idea behind gradient descent in machine learning.
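As a sketch of that idea, the gradient-descent loop below minimizes the assumed example f(v) = Σ vᵢ² in 100 variables by repeatedly stepping against the gradient (the learning rate and iteration count are illustrative):

```python
# Gradient descent in 100 variables: step opposite the gradient,
# which points toward steepest ascent, to walk downhill toward
# the minimum of f(v) = sum(v_i^2).
def grad(v):
    return [2 * vi for vi in v]   # partial derivative per component

v = [5.0] * 100   # start every variable at 5
lr = 0.1          # learning rate (step size)
for _ in range(200):
    g = grad(v)
    v = [vi - lr * gi for vi, gi in zip(v, g)]

print(max(abs(vi) for vi in v))  # very close to 0: at the minimum
```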
