**1. The problem statement, all variables and given/known data**

A ball is thrown vertically upward from ground level with an initial velocity V0. The ball rises to a height h, then lands on the roof of a building of height h/2. The entire motion takes 10 s. Find the height h and the initial velocity V0.

**2. Relevant equations**

Constant-acceleration kinematics:

v = v0 - g*t

y = y0 + v0*t - (1/2)*g*t^2

v^2 = v0^2 - 2*g*(y - y0)

**3. The attempt at a solution**

I can't come up with anything. I'm using a final velocity of 0 m/s when running everything through, and none of the values make sense. I just need to know whether it's actually solvable for real values. I'm in a hurry for work right now, but I will check back later.

I'm using:

t0 = 0 s

t2 = 10 s

v1 = 0 m/s (turning point)

v2 = 0 m/s (stop)

a = -g

y0 = 0 m

y1 = h

y2 = h/2
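As a sanity check on solvability (not part of the original post): if the landing velocity v2 is allowed to be nonzero, the two unknowns follow from the total time alone. With t_up = v0/g, h = v0^2/(2g), and a fall of h/2 from rest taking t_down = sqrt(h/g), the condition t_up + t_down = 10 s fixes v0. A minimal Python sketch, assuming g = 9.8 m/s^2:

```python
import math

g = 9.8   # m/s^2, assumed value
T = 10.0  # total flight time, s

# Time up: t_up = v0/g; peak height: h = v0^2/(2g).
# Fall from h to the roof at h/2 starts from rest at the peak,
# so t_down = sqrt(2*(h/2)/g) = sqrt(h/g) = v0/(g*sqrt(2)).
# Total time: (v0/g)*(1 + 1/sqrt(2)) = T  ->  solve for v0.
v0 = g * T / (1 + 1 / math.sqrt(2))
h = v0**2 / (2 * g)

print(f"v0 = {v0:.1f} m/s")  # ≈ 57.4 m/s
print(f"h  = {h:.1f} m")     # ≈ 168.1 m

# The ball is still moving when it reaches the roof,
# so assuming v2 = 0 over-constrains the problem.
v_impact = math.sqrt(2 * g * (h - h / 2))
print(f"impact speed = {v_impact:.1f} m/s")  # ≈ 40.6 m/s
```

The values are real and positive, so the problem is solvable once v2 = 0 is dropped.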
