
Thinking about a Windows-hosted build process that will periodically drop files to disk, to be replicated to several other Windows servers in the same datacenter. The other machines would run IIS and serve those files to the masses.

The total corpus would be millions of files, hundreds of GB of data. It'd have to deal with possible contention on the target servers, high-latency links (e.g. over a WAN), and cold-starting clean servers.

Solutions I've thought about so far:

  • A queued system, with daemons that either wake periodically and copy or run as services.
  • SAN - expensive, complex, more expensive.
  • ROBOCOPY on a timed job - simple but effective (a rough sketch follows this list), though with lots of internal/indeterminate state, e.g. where it's at in the copy, and errors.
  • Off-the-shelf replication software - less expensive than a SAN, but still expensive.
  • UNC shared folders and no replication - higher latency, lower cost, but still needs a clustering solution too.
  • DFS Replication.
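
For illustration, the ROBOCOPY option would look roughly like this - a minimal sketch, where all paths, server names, and the schedule are placeholders:

    rem replicate.cmd - one-way mirror of the build drop to the local web root
    robocopy \\BUILD01\drop D:\inetpub\content /MIR /Z /R:3 /W:10 /NP /LOG+:D:\logs\repl.log

    rem register it to run every 15 minutes as SYSTEM
    schtasks /Create /TN ContentRepl /TR D:\scripts\replicate.cmd /SC MINUTE /MO 15 /RU SYSTEM

Note that /MIR deletes target files that no longer exist at the source, and the appended log is about the only visibility into where a run got to - which is exactly the state concern above.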

What else have other folks used?

4 Answers


I've used rsync scripts with good success for this type of work - thousands of machines in our case. I believe there is an rsync server for Windows, but I have not used it on anything other than Linux.
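
A minimal sketch of the kind of wrapper script I mean, here as a Windows batch file since that's your platform - the host names, the "content" rsync module, and the Cygwin-style source path are all hypothetical (the Windows rsync builds I'm aware of are Cygwin-based):

    rem push the drop to each web server's rsync daemon; -z compresses, which helps on WAN links
    for %%S in (WEB01 WEB02 WEB03) do (
        rsync -az --delete /cygdrive/d/drop/ %%S::content/
    )

rsync only transfers deltas, so warm runs over latent links stay cheap, and --delete keeps the mirrors exact.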

answered 2008-09-17T18:38:21.120

Though we do not manage millions of files or hundreds of GB of data, we send and collect lots of files overnight between our main company and its agencies abroad. We have been using Allway Sync for a while. It handles folder/FTP synchronization, has a nice interface for analyzing and comparing folders and files, and it can of course be scheduled.

answered 2008-09-19T06:52:19.677

UNC shared folders with no replication have many downsides, especially if IIS is going to use UNC paths as home directories for sites. Under stress you will run into http://support.microsoft.com/default.aspx/kb/810886 because of the number of simultaneous sessions against the server sharing the folder. You will also see slow IIS site startups, since IIS wants to scan/index/cache the UNC folder (depending on the IIS version and ASP settings).
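
If you do have to stay on UNC content, the mitigation usually discussed alongside that KB is raising the SMB concurrent-request limits in the registry. A sketch - I'd treat the exact value (2048 here) as something to test rather than gospel, and a service restart or reboot is typically needed:

    rem on each IIS box (the SMB client side)
    reg add HKLM\SYSTEM\CurrentControlSet\Services\lanmanworkstation\parameters /v MaxCmds /t REG_DWORD /d 2048 /f

    rem on the file server that hosts the share (the SMB server side)
    reg add HKLM\SYSTEM\CurrentControlSet\Services\lanmanserver\parameters /v MaxMpxCt /t REG_DWORD /d 2048 /f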

I've seen tests with DFS that are very promising, exhibiting none of the above restrictions.

answered 2008-09-23T15:10:30.373

We use ROBOCOPY in my organization to pass files around. It runs very smoothly, and I feel it is worth a recommendation.

Additionally, you are not doing anything too crazy. If you are also proficient in Perl, I am sure you could write a quick script to fulfill your needs.
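
As a minimal illustration of that kind of quick script - written as a plain batch loop rather than Perl, with made-up server names and share:

    rem fan the build drop out to each front-end server
    for %%S in (WEB01 WEB02 WEB03) do (
        robocopy D:\drop \\%%S\content$ /MIR /Z /R:3 /W:10 /NP /LOG+:D:\logs\%%S.log
        rem robocopy exit codes of 8 or above mean something failed to copy
        if errorlevel 8 echo %%S failed >> D:\logs\failures.log
    )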

answered 2008-09-17T18:39:49.727