Are you really sure the web services won't someday start returning a lot more results? What if a bug in one of them accidentally returns 50,000 copies of the same result to you? In each of your solutions:
A larger-than-expected number of results will cause you to spam the web services with repeated requests for the same results as users page through them.
A larger-than-expected number of results will end up temporarily taking up space in your database. Also, in a web app, how will you know when to clear the cache?
A larger-than-expected number of results will end up as a huge page in the user's browser, possibly not rendering correctly until the whole thing has downloaded.
I really like option 3. The caching is done at the place where the data is wanted, there are no redundant hits to the web services, and paging will be super fast for the users.
If you're really certain that no more than 60-70 results will ever be returned, or that your users will never want a very large number of results, you could combine option 3 with a cap on the number of results you return.
Even in the worst case, where the web services return erroneous or unexpected results, you could trim the list to the first N, send those down to the browser, and paginate them there with JavaScript (see the sketch below).
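To make that concrete, here's a minimal sketch of the cap-plus-client-side-pagination idea in TypeScript. The endpoint URL, the `Result` shape, and names like `MAX_RESULTS` and `PAGE_SIZE` are illustrative assumptions, not part of your existing code:

```typescript
// Illustrative cap and page size; tune these to your own limits.
const MAX_RESULTS = 70;
const PAGE_SIZE = 10;

// Hypothetical result shape; adjust to whatever your web services return.
interface Result {
  id: string;
  title: string;
}

let cachedResults: Result[] = [];

async function loadResults(query: string): Promise<void> {
  // Hypothetical endpoint; replace with your actual web service call.
  const response = await fetch(`/api/search?q=${encodeURIComponent(query)}`);
  const all: Result[] = await response.json();

  // Trim to the cap once, so a buggy 50,000-row response can't blow up the page.
  cachedResults = all.slice(0, MAX_RESULTS);
}

function getPage(pageNumber: number): Result[] {
  // Paginate entirely in the browser; no further hits to the web services.
  const start = (pageNumber - 1) * PAGE_SIZE;
  return cachedResults.slice(start, start + PAGE_SIZE);
}

function pageCount(): number {
  return Math.ceil(cachedResults.length / PAGE_SIZE);
}
```

After the single `loadResults()` call, every page flip is just a `slice()` over the cached array, which is what makes paging feel instant for the user.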