A model rocket is launched from the ground with an initial velocity of 65 m/s directly upwards. If a horizontal wind is blowing at 5 m/s during the flight, how far from the launch point will the rocket land?

I know I should probably be using the equation x = x0 + v0*t + (1/2)at^2, but I am having a really hard time getting started on the problem.

You're on the right track. That equation describes the vertical motion: set the vertical position back to zero (the moment the rocket lands), solve for the time of flight t, and then the horizontal drift is just x = v_wind * t, with v_wind the 5 m/s wind speed.
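
To make those two steps concrete, here is a minimal sketch of the calculation, assuming g = 9.8 m/s^2 and the usual simplification that the rocket drifts with the full 5 m/s wind for the entire flight:

```python
g = 9.8          # gravitational acceleration, m/s^2 (assumed value)
v0 = 65.0        # initial vertical speed, m/s
v_wind = 5.0     # horizontal wind speed, m/s

# Vertical motion: y(t) = v0*t - (1/2)*g*t^2 = 0 when the rocket lands.
# The nonzero root gives the time of flight t = 2*v0/g.
t_flight = 2 * v0 / g

# Horizontal drift: the rocket is carried sideways at the wind speed,
# so x = v_wind * t.
drift = v_wind * t_flight

print(f"time of flight: {t_flight:.1f} s")    # about 13.3 s
print(f"landing distance: {drift:.1f} m")     # about 66 m
```

The key idea is that the vertical and horizontal motions are independent: gravity only determines how long the rocket is in the air, and the wind only determines how far it moves sideways during that time.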