Your question is a little vague, but PHP can cache dynamically created pages as static ones. A very simplistic system would use output buffering to capture the dynamic content and write it to a file. You then need to decide how to invalidate that cache (for example, by deleting the static file).
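A minimal sketch of that idea (the cache directory, TTL, and generated content below are illustrative placeholders, not part of your application):

```php
<?php
// Sketch: cache a dynamically generated page as a static file.
$cacheFile = __DIR__ . '/cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
$ttl = 300; // freshness window in seconds -- tune to your needs

// Serve the static copy if it exists and is still fresh.
if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
    readfile($cacheFile);
    exit;
}

// Otherwise capture the dynamically generated output...
ob_start();

// ...your expensive page generation goes here (placeholder)...
echo '<h1>Generated at ' . date('c') . '</h1>';

// ...write it to the static file, then send it to the client as usual.
file_put_contents($cacheFile, ob_get_contents(), LOCK_EX);
ob_end_flush();
```

Deleting `$cacheFile` (or letting the TTL expire) is the invalidation step mentioned above.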
However, you should use caching only when you've encountered a specific performance problem; otherwise it needlessly adds complexity to your application. You can also cache just the underperforming resource (for example, database results) instead of the entire page.
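For instance, caching a single slow query could look something like this (the query, function name, and cache path are hypothetical):

```php
<?php
// Sketch: cache one expensive query result instead of the whole page.
function cached_report(PDO $db, int $ttl = 60): array
{
    $cacheFile = sys_get_temp_dir() . '/report.cache';

    // Return the cached result while it is still fresh.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return unserialize(file_get_contents($cacheFile));
    }

    // Placeholder for the slow query you actually want to cache.
    $rows = $db->query('SELECT id, total FROM orders')
               ->fetchAll(PDO::FETCH_ASSOC);

    file_put_contents($cacheFile, serialize($rows), LOCK_EX);
    return $rows;
}
```

The rest of the page still renders dynamically; only the expensive part is cached.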
If you would elaborate more on what you want to achieve, you'll get a more specific answer.
EDIT:
I've reread your modified question. Your premise is mistaken: a person or web crawler requesting the page cannot tell from its contents whether it was dynamically generated or not. They could try to guess from the URL format, but URLs are controlled by the application. The page can be bookmarked and indexed by search engines just the same either way.
Caching to actual static files should be used as an optimization when generating those pages on the fly is too expensive.