Normally I’d recommend adding a sitemap.xml file to the root directory that specifies an update frequency of daily or twice daily. This should encourage Googlebot to crawl the page often enough to keep it up to date, and you can manually submit the sitemap in Webmaster Tools to ensure it’s indexed. Since Unbounce doesn’t yet support sitemap.xml files, you’ll need to take a slightly different approach.
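For reference, here’s roughly what a minimal sitemap.xml with a daily change frequency looks like under the sitemaps.org protocol (the URL below is just a placeholder for your own page):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled; <loc> is a placeholder -->
  <url>
    <loc>https://example.com/landing-page/</loc>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Note that `changefreq` is a hint, not a guarantee — Googlebot treats it as a suggestion for how often to revisit the page.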
In Google Webmaster Tools there is an option to “Fetch as Google”. Choose the “Fetch & Render” option and make sure there are minimal or no blocks to rendering. When it’s finished, a “Submit to Index” button will appear, which lets you request that Google crawl that page and refresh its contents.
Here is a link to Google’s own step-by-step guide on how to accomplish this.
Then you’ll just need to hope that it happens fairly swiftly and they don’t keep you waiting.
Tip: If you have other websites that are frequently crawled by Googlebot, add a link to your landing page from them and it should get crawled at the same time. Adding multiple links from multiple sites will increase your chances of it happening sooner rather than later.
I hope that helps a little,