Web spiders updating dynamic pages
In the beginning of the Internet, all HTML pages that were sent to clients were assembled server side. It was in that period that Google first came up with the idea for its search engine. Once websites started assembling their pages client side with JavaScript, however, a discrepancy arose for a fair share of public-facing websites between what the crawler saw and the end result in the user's browser.

The JavaScript processing capabilities of Googlebot have evolved quite a bit in recent years. However, Google is not clear about which features are supported and which are not. Letting crawlers do all the heavy lifting also hurts your crawl rate, since they spend much more time on each page.

One workaround is to detect crawlers and serve them a separate, crawler-only version of your pages. This approach, however, is a form of cloaking, which is considered a bad practice in SEO land. There is a much better way to cope with this problem: use a prerenderer. A prerenderer is a tool that runs through your whole site, executes the JavaScript, and produces static HTML versions of the resulting pages. Each snapshot should be identical to the end result a user gets to see in their browser, but in pure HTML form. Once that's done, the prerenderer caches those HTML snapshots and serves them to Google or Bing when requested. Because the snapshot matches what a user would see, the search engine will happily index it and expose the clean URL in search results. This lets you optimize exactly what is seen by the crawler and make sure everything you want to be discovered by search engines is easily found.
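To make the serving side concrete, here is a minimal sketch of the decision a prerender-aware server makes on each request: crawlers identified by their User-Agent get the cached HTML snapshot, everyone else gets the normal JavaScript app shell. All names here (the cache contents, the shell markup, the `htmlFor` helper) are illustrative assumptions, not the API of any particular prerendering product, and a real setup would populate the cache from an actual prerender pass rather than by hand.

```javascript
// Sketch only: an in-memory cache standing in for snapshots
// produced earlier by a prerender pass (hypothetical data).
const CRAWLER_PATTERN = /googlebot|bingbot/i;

// Hypothetical snapshot cache: URL path -> prerendered static HTML.
const snapshotCache = new Map([
  ['/products', '<html><body><h1>Products</h1><ul><li>Widget</li></ul></body></html>'],
]);

// The app shell regular browsers receive; they render it client side.
const appShell =
  '<html><body><div id="app"></div><script src="/app.js"></script></body></html>';

// Decide which HTML to serve for a given path and User-Agent.
function htmlFor(path, userAgent) {
  const isCrawler = CRAWLER_PATTERN.test(userAgent || '');
  if (isCrawler && snapshotCache.has(path)) {
    // Crawler gets the static snapshot of the fully rendered page.
    return snapshotCache.get(path);
  }
  // Everyone else (and uncached paths) gets the client-side app.
  return appShell;
}
```

Note that this only stays on the right side of the cloaking line if the snapshot is identical in content to what a browser would render; branching on the User-Agent to serve *different* content is exactly the bad practice described above.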