We're working on a product that changes the content of a given web page. What we'd like to do is have this changed content crawled by Google. The content replacement is triggered by different URLs (either a subdomain, HTML5 pushState, or a hashbang¹).
What happens right now is that a user (or a bot) sees the original content momentarily (usually for just a fraction of a second) before it gets replaced.
Is it possible to hook into the browser's rendering and change the content before it is first painted? Would this have a positive effect on Google's crawling? Or does anyone have a better idea besides pushing new pages with pushState?
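For context, this is a rough sketch of the kind of "replace before anyone sees it" approach I have in mind (not our actual code: it assumes our script can be injected into the `<head>`, and `.replace-target` plus the replacement text are placeholders):

```typescript
// 1. Hide the page before the first paint so visitors never see the original.
const style = document.createElement("style");
style.textContent = "body { visibility: hidden; }";
document.head.appendChild(style);

// 2. Once the DOM is parsed, perform the replacement, then reveal the page.
document.addEventListener("DOMContentLoaded", () => {
  document.querySelectorAll(".replace-target").forEach((el) => {
    el.textContent = "replacement content for this URL"; // placeholder
  });
  style.remove(); // un-hide: the user only ever sees the replaced content
});
```

That would fix the flash of original content for users, but I'm not sure it helps the crawler at all, which is the real question.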
¹ Yes, shoot me for the #!, I know. But nobody uses it, so it's great for us, since we don't control the site the script is running on.
EDIT:
HTML snapshots seem to be a possible solution here: proxy requests coming from search engines, evaluate the original page (including our replacement), and send the resulting HTML back to the crawler: https://developers.google.com/webmasters/ajax-crawling/docs/html-snapshot
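Something like the following is what I picture (a rough sketch, not working infrastructure: it assumes a Node.js front server and a hypothetical `renderSnapshot()` helper that would load the original page, run our replacement script in a headless browser, and return the final HTML; per the linked scheme, Google requests `/page#!state` as `/page?_escaped_fragment_=state`):

```typescript
import * as http from "http";

// Hypothetical helper: would drive a headless browser to load the original
// page, run our replacement script, and return the fully rendered HTML.
async function renderSnapshot(pathWithHashbang: string): Promise<string> {
  return `<html><body>rendered snapshot for ${pathWithHashbang}</body></html>`;
}

const server = http.createServer(async (req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  const fragment = url.searchParams.get("_escaped_fragment_");

  if (fragment !== null) {
    // Crawler request under Google's AJAX crawling scheme:
    // serve the pre-rendered HTML snapshot for the equivalent #! URL.
    const html = await renderSnapshot(`${url.pathname}#!${fragment}`);
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(html);
  } else {
    // Regular visitor: serve the original page, where our client-side
    // script does the replacement as it does today.
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end("<!-- original page served here -->");
  }
});

server.listen(8080);
```

Does this sound like a sane direction, or is there a simpler way to get the replaced content indexed?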