**1. The problem statement, all variables and given/known data**

An object is thrown up from a cliff at 10 m/s and reaches a velocity of 20 m/s [down] as it lands. If the acceleration due to gravity is 9.8 m/s^{2}, what is the object's displacement? How long did it take the object to land from the time it was thrown?

**2. Relevant equations**

I’m not sure if all of the below are relevant, or perhaps some equations are missing.

Δx = v_{0}t + 1/2at^{2}

(displacement = initial velocity x time + 1/2 acceleration x time squared)

v_{f}^{2} = v_{0}^{2} + 2aΔx

(final velocity squared = initial velocity squared + 2 x acceleration x displacement)

**3. The attempt at a solution**

t_{1} = v_{0}/a = 10/9.8 = 1.02 s (time to rise from the throw to the top of the arc)

t_{2} = v_{f}/a = 20/9.8 = 2.04 s (time to fall from the top of the arc to the landing point)

t_{1} + t_{2} = 3.06 s ≈ 3.1 s

Not sure how to find the displacement, but the answer key says it's -15 m. How do I find it?
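One way to sanity-check the arithmetic above is to rearrange the second relevant equation for Δx, and the velocity equation v_{f} = v_{0} + at for t. A minimal Python sketch (the sign convention, taking up as positive, and the variable names are my own, not from the problem):

```python
# Sign convention: up is positive, so the throw is +10 m/s,
# the landing velocity is -20 m/s, and gravity is -9.8 m/s^2.
v0 = 10.0    # initial velocity (m/s)
vf = -20.0   # final velocity (m/s)
a = -9.8     # acceleration due to gravity (m/s^2)

# Rearranging v_f^2 = v_0^2 + 2*a*dx for displacement:
dx = (vf**2 - v0**2) / (2 * a)   # about -15.3 m, i.e. 15 m below the throw point

# Rearranging v_f = v_0 + a*t for the total flight time:
t = (vf - v0) / a                # about 3.06 s

print(round(dx, 1), round(t, 2))
```

Both values agree with the answer key's -15 m and the 3.1 s total time computed above.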
