Optimization and approximation

Optimization and approximation are two closely related topics. In optimization, a real-valued function f(x), called the objective function, is minimized. In approximation, a function f(x) is approximated by another, usually simpler, function. We need a measure of the closeness of the approximation, usually a real-valued function φ(x); a "best" approximation is then found by minimizing this measure in some sense.

1 Continuous Optimization

We study some general methods for solving the problem

    min_{x ∈ S} f(x)

where f(x) is the objective function, single-variable and real-valued, and S is the support.

Golden section search

Assumption: f(x) has a unique global minimum in [a, b].

• If x* is the minimizer, then f(x) monotonically decreases on [a, x*] and monotonically increases on [x*, b], for otherwise we would have additional local minima.

Choose interior points c and d:

    c = a + r(b − a)
    d = a + (1 − r)(b − a),    0 < r < 0.5

Since c < d, unimodality implies that if f(c) ≤ f(d) the minimizer cannot lie in (d, b], and if f(c) > f(d) it cannot lie in [a, c). The interval is therefore shrunk as follows:

    if f(c) ≤ f(d)
        b = d
    else
        a = c
    end

Repeating this step shrinks [a, b] by a factor of 1 − r per iteration. The name "golden section" comes from choosing r = (3 − √5)/2 ≈ 0.382, for which one of the two interior points of the new interval coincides with a point already evaluated, so each iteration costs only one new function evaluation.
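The iteration above can be sketched in Python. This is a minimal illustration (function and parameter names are my own), using the golden-section ratio r = (3 − √5)/2 so that one of f(c), f(d) is reused in each step:

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a function f that is unimodal on [a, b]."""
    r = (3 - math.sqrt(5)) / 2  # golden-section ratio, ≈ 0.381966
    c = a + r * (b - a)
    d = a + (1 - r) * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc <= fd:
            # minimizer lies in [a, d]: shrink from the right;
            # the old c becomes the new d (its value fc is reused)
            b, d, fd = d, c, fc
            c = a + r * (b - a)
            fc = f(c)
        else:
            # minimizer lies in [c, b]: shrink from the left;
            # the old d becomes the new c (its value fd is reused)
            a, c, fc = c, d, fd
            d = a + (1 - r) * (b - a)
            fd = f(d)
    return (a + b) / 2

# Example: f(x) = (x - 2)^2 is unimodal on [0, 5] with minimizer x* = 2.
x_star = golden_section_search(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

The reuse of one point per iteration is exactly what the condition (1 − r)² = r, satisfied by the golden-section ratio, guarantees.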