I am trying to find out how much time it takes for a URL to load. For this I use file_get_contents:
$time_start = microtime(true);
$homepage = file_get_contents('http://www.stackoverflow.com/');
$time_end = microtime(true);
$execution_time = ($time_end - $time_start);
I think this way is incorrect because it won't load any of the JavaScript sources. I was thinking that if I could somehow scan the URL's source code for JavaScript sources, open each JavaScript source and measure how long it took to load, and then combine each script's loading time with the file_get_contents loading time, I would get a somewhat more accurate result than I have now.
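A minimal sketch of that idea, using PHP's built-in DOMDocument to find the script tags (the function names scriptSources, timeFetch, and totalLoadTime are my own, not from the post; relative script URLs and the fact that browsers download scripts in parallel are not handled, so the result is only a rough approximation):

```php
// Extract the src attribute of every <script> tag in an HTML string.
function scriptSources($html) {
    $dom = new DOMDocument();
    @$dom->loadHTML($html);   // suppress warnings from sloppy real-world markup
    $srcs = array();
    foreach ($dom->getElementsByTagName('script') as $script) {
        $src = $script->getAttribute('src');
        if ($src !== '') {    // skip inline <script> blocks with no src
            $srcs[] = $src;
        }
    }
    return $srcs;
}

// Time a single download with file_get_contents, as in the snippet above.
function timeFetch($url) {
    $start = microtime(true);
    @file_get_contents($url); // ignore fetch errors for this sketch
    return microtime(true) - $start;
}

// Rough total: page load time plus each external script fetched one by one.
function totalLoadTime($url) {
    $start = microtime(true);
    $html  = file_get_contents($url);
    $total = microtime(true) - $start;
    foreach (scriptSources($html) as $src) {
        $total += timeFetch($src); // relative srcs would need resolving first
    }
    return $total;
}
```

Fetching the scripts sequentially overstates what a real browser does, but it matches the "combine each loading time" idea described above.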
I also tried a few other methods using cURL, JavaScript, and iframes, but all of them had problems (unrealistic results or crashes).
Edit: I need to write a script; I know there are many tools available for this (did I mention anywhere that I'm looking for a tool?).
Edit 2: OK, after hours of research I found something that I think might help me: I need to build a DOM from the page using http://simplehtmldom.sourceforge.net , then find all script elements and get the src from each of them. I don't know why no one suggested this earlier.