There is a SharePoint site at http://mySPSite.com. A full client-side download of a page, including images, CSS, and JavaScript files, normally takes about 12 seconds. I want to monitor the complete request using PowerShell, simulating the download of all the pages exactly the way a browser would.

What would be an appropriate way to achieve this?


1 Answer


I think you could use this:

http://www.howtogeek.com/124736/stupid-geek-tricks-extract-links-off-any-webpage-using-powershell/

to collect all the links and images on your site into an array. Then loop over that array and download each one with this script:
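A minimal sketch of that first step, assuming PowerShell 3.0+ (which ships Invoke-WebRequest) and using the site URL from the question as a placeholder:

```powershell
# Fetch the page once; Invoke-WebRequest parses the HTML for us.
$base = [Uri]"http://mySPSite.com"
$page = Invoke-WebRequest -Uri $base

# Collect href/src values from anchors and images, resolve relative
# URLs against the site root, and de-duplicate the result.
$resources = @($page.Links.href) + @($page.Images.src) |
    Where-Object { $_ } |
    ForEach-Object { [Uri]::new($base, $_).AbsoluteUri } |
    Sort-Object -Unique
```

Note this only gathers anchors and images; CSS and script references would need a similar pass over link and script tags (for example via the page's ParsedHtml property), so treat this as a starting point rather than a full browser simulation.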

$url = "http://website.com/downloads/Iwantthisfile.txt"
$path = "C:\temp\thisisthefile.txt"

# param([string]$url, [string]$path)  # uncomment to accept these as script parameters

# If $path has no parent directory, or that directory does not exist,
# fall back to saving the file in the current directory.
if (!(Split-Path -Parent $path) -or !(Test-Path -PathType Container (Split-Path -Parent $path))) {
    $path = Join-Path $pwd (Split-Path -Leaf $path)
}

"Downloading [$url]`nSaving at [$path]"
$client = New-Object System.Net.WebClient
$client.DownloadFile($url, $path)

$path
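Since the goal is to monitor how long the full download takes, you could wrap the loop in Measure-Command. A hedged sketch, assuming $resources is the array of URLs collected earlier (not part of the original answer):

```powershell
# Time the download of every collected resource. DownloadData returns the
# bytes in memory, which avoids writing each file to disk just to time it.
$client = New-Object System.Net.WebClient
$elapsed = Measure-Command {
    foreach ($url in $resources) {
        $client.DownloadData($url) | Out-Null   # fetch and discard the bytes
    }
}
"Total download time: $($elapsed.TotalSeconds) seconds"
```

Bear in mind this downloads resources sequentially over a single connection, whereas a browser fetches them in parallel, so the measured time will be an upper bound rather than an exact match for the 12 seconds observed in a browser.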
Answered 2013-08-09T11:11:30.263