Suppose f, f', and f'' ∈ L¹, f and f' are absolutely continuous on every bounded subset of ℝ¹, and
lim_{x→±∞} f(x) = 0 = lim_{x→±∞} f'(x). Show that if g = f'', then ĝ(t) = −t² f̂(t).
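A sketch of the intended computation, assuming the convention f̂(t) = ∫ f(x)e^{−itx} dx (if the exercise uses a different normalization, the same argument applies up to the constant):

```latex
% Integrate by parts twice. Each boundary term vanishes because
% f(x) -> 0 and f'(x) -> 0 as x -> +-infinity, while |e^{-itx}| = 1.
\begin{aligned}
\hat g(t) &= \int_{-\infty}^{\infty} f''(x)\, e^{-itx}\, dx \\
          &= \Bigl[ f'(x)\, e^{-itx} \Bigr]_{-\infty}^{\infty}
             + it \int_{-\infty}^{\infty} f'(x)\, e^{-itx}\, dx \\
          &= it \left( \Bigl[ f(x)\, e^{-itx} \Bigr]_{-\infty}^{\infty}
             + it \int_{-\infty}^{\infty} f(x)\, e^{-itx}\, dx \right) \\
          &= (it)^2 \hat f(t) \;=\; -t^2 \hat f(t).
\end{aligned}
```

The absolute continuity of f and f' on bounded sets is what licenses integration by parts on each finite interval [−R, R]; the hypotheses f'' ∈ L¹ and the vanishing limits then justify letting R → ∞.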