James Bern
Postdoctoral Associate
MIT CSAIL
Distributed Robotics Lab
jbern@mit.edu
he / him / his




Research: Soft-Rigid Robot Design, Modeling, and Control

> Undergraduate Research Opportunities

> Google Scholar





Draw Robots: Data-Oriented Introduction to Graphics × Robotics

This is a short course to help undergraduate and early graduate researchers pick up the practical skills needed to do novel research at the intersection of computer graphics and robotics.


Sample slides



Optimization Glossary

Notation: E_x = dE / dx, E_xx = d^2E / dx^2.

Gradient descent: The first-order Taylor expansion of the energy, E(x + dx) ≈ E(x) + E_x(x) dx, is minimized (for a fixed step length) by the search direction dx ← -E_x.
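
A minimal sketch in Python/NumPy, using a made-up quadratic test energy E(x) = 0.5 x^T A x - b^T x (the matrix A, vector b, and fixed stepsize are illustrative, not from any particular problem):

    import numpy as np

    # Hypothetical quadratic test energy E(x) = 0.5 x^T A x - b^T x.
    A = np.array([[3.0, 0.5],
                  [0.5, 1.0]])
    b = np.array([1.0, -2.0])

    def E(x):   return 0.5 * x @ A @ x - b @ x
    def E_x(x): return A @ x - b  # gradient of E

    x = np.zeros(2)
    alpha = 0.1  # fixed stepsize, for illustration only
    for _ in range(200):
        x = x - alpha * E_x(x)  # step along dx <- -E_x
    print(x, np.linalg.solve(A, b))  # both should print the minimizer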

Newton's method: The first-order Taylor expansion of the gradient, E_x(x + dx) ≈ E_x(x) + E_xx(x) dx, is zero for the search direction dx ← -inv(E_xx) E_x.
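
A 1D sketch on a made-up smooth energy E(x) = cosh(x) - x/2, chosen so the Hessian E_xx = cosh(x) is always positive:

    import numpy as np

    def E_x(x):  return np.sinh(x) - 0.5  # gradient of E
    def E_xx(x): return np.cosh(x)        # Hessian (positive everywhere)

    x = 2.0  # initial guess
    for _ in range(10):
        x += -E_x(x) / E_xx(x)  # dx <- -inv(E_xx) E_x
    print(x, E_x(x))  # E_x(x) ~ 0 at the minimizer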

Basic backtracking line search: Given current best guess x and search direction dx, our goal is to find a stepsize α such that E(x + αdx) < E(x). First check whether α = 1 decreases the objective. If not, try α = 1/2, then 1/4, and so on. Give up after, e.g., 15 attempts, either returning some nominal α or terminating the optimization.
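
The same procedure as a small Python helper (a sketch of exactly the scheme above; the function name and the None return convention are illustrative):

    def backtracking_line_search(E, x, dx, max_attempts=15):
        # Halve alpha until the step decreases the objective.
        alpha = 1.0
        for _ in range(max_attempts):
            if E(x + alpha * dx) < E(x):
                return alpha
            alpha *= 0.5
        return None  # caller returns some nominal alpha or terminates

The returned α would be used as x ← x + αdx, with dx the gradient descent or Newton direction from above.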

Direct sensitivity analysis: Given control u, the resulting state x(u) solves the physical constraint C(u, x) = 0. Taking the total derivative of C(u, x(u)) = 0 with respect to u yields the linear system C_u + C_x x_u = 0, which can be solved for the sensitivity x_u = -inv(C_x) C_u.
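
A sketch with a hypothetical constraint that is linear in x, C(u, x) = K x - f(u) = 0 (K, f, and the control u are made up for illustration); the last line evaluates x_u = -inv(C_x) C_u:

    import numpy as np

    K = np.array([[2.0, -1.0],
                  [-1.0, 2.0]])

    def f(u): return np.array([u[0], u[0] * u[1]])

    def C_x(u, x): return K                          # dC/dx
    def C_u(u, x): return -np.array([[1.0,  0.0],
                                     [u[1], u[0]]])  # dC/du = -df/du

    u = np.array([1.0, 3.0])
    x = np.linalg.solve(K, f(u))                  # state solving C(u, x) = 0
    x_u = -np.linalg.solve(C_x(u, x), C_u(u, x))  # sensitivity dx/du
    print(x_u)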