
Here's the logging problem.

There is an ASP.NET website. The main feature of the website is a list of emails (saved in the DB). Every time there is an error, the details are saved by NLog to a file. The website has an admin panel, which reads and lists the website's error log.

There is a worker. The worker connects to an Exchange Server and checks for emails whose subject matches a pattern; matching emails are saved to the DB. Every time there is an error, the details are saved by NLog to a different file.

Both the website and the worker use the same logger class. The settings in the website's web.config and the worker's application config simply point to different target files.
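For context, that shared logger class might look something like this minimal sketch (the class and member names here are assumptions, not the actual code, and it assumes a current NLog version); which file it writes to is decided entirely by the NLog target configured in each application's config file:

```csharp
using System;
using NLog;

// Hypothetical shared wrapper used by both the website and the worker.
// The target file (website log vs. worker log) comes from the hosting
// application's NLog configuration, so the same class works in both apps.
public static class ErrorLogger
{
    private static readonly Logger Log = LogManager.GetCurrentClassLogger();

    public static void Error(string message, Exception ex)
    {
        // NLog renders the exception through the target's ${exception} layout.
        Log.Error(ex, message);
    }
}
```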

Here is the new feature: the admin panel should also list the worker's errors. I can see multiple ways to go about that.

Option 1: The website figures out where the worker's log file is, then reads that file.

Option 2: Introduce a new worker/third party that handles logging for both the website and the worker. It could save both sets of errors in a single log, or even in a new table in the database.

Option 3: Configure both the website and the worker to point to the same target file, although I do not know the directory structure of the Azure file system.

What would be the cleanest way to go about it?


2 Answers


What you could do is make use of the custom logs functionality of Windows Azure Diagnostics. Have your web role and worker role write to different (or the same) targets using NLog; these files will be saved on each VM's local storage. Then, using Windows Azure Diagnostics, you transfer those files periodically to Windows Azure Blob storage and have a third worker role process the blobs periodically.
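As a rough illustration of the scheduled-transfer part, something along these lines could go in each role's OnStart, using the classic Windows Azure Diagnostics API (the local resource name "NLogFiles" and the container name are assumptions; adjust them to wherever your NLog targets actually write):

```csharp
using System;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        var config = DiagnosticMonitor.GetDefaultInitialConfiguration();

        // Assumed local-storage resource where NLog writes its log files.
        var logDirectory = RoleEnvironment.GetLocalResource("NLogFiles").RootPath;

        // Ship everything in that directory to blob storage on a schedule.
        config.Directories.DataSources.Add(new DirectoryConfiguration
        {
            Path = logDirectory,
            Container = "wad-nlog-files",   // assumed blob container name
            DirectoryQuotaInMB = 128
        });
        config.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

        DiagnosticMonitor.Start(
            "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);

        return base.OnStart();
    }
}
```

The third role (or the admin panel itself) can then read the transferred blobs from that container instead of touching each VM's local disk.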

Take a look at the Cloud Service Fundamentals project by the Windows Azure CAT team: http://code.msdn.microsoft.com/windowsazure/Cloud-Service-Fundamentals-4ca72649. It uses NLog to collect logging data and persists it in a blob container called telemetry-logs; once the data is in that container, it periodically polls it and pushes it into a SQL Azure database.

answered 2013-09-24T08:57:26

Send NLog entries to a database instead of a local file:
https://github.com/nlog/NLog/wiki/Database-target
If that generates too many entries, send only FATAL and ERROR levels to the database.
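A minimal sketch of that idea, configuring the NLog Database target in code rather than in the config file (the ErrorLog table, its columns, and the connection string are assumptions for illustration):

```csharp
using NLog;
using NLog.Config;
using NLog.Targets;

public static class LoggingSetup
{
    public static void ConfigureDatabaseLogging(string connectionString)
    {
        var dbTarget = new DatabaseTarget
        {
            Name = "errorDb",
            ConnectionString = connectionString,
            // "ErrorLog" and its columns are assumptions for this sketch.
            CommandText =
                "INSERT INTO ErrorLog (Logged, Level, Source, Message, Exception) " +
                "VALUES (@logged, @level, @source, @message, @exception)"
        };
        dbTarget.Parameters.Add(new DatabaseParameterInfo("@logged", "${date}"));
        dbTarget.Parameters.Add(new DatabaseParameterInfo("@level", "${level}"));
        dbTarget.Parameters.Add(new DatabaseParameterInfo("@source", "${logger}"));
        dbTarget.Parameters.Add(new DatabaseParameterInfo("@message", "${message}"));
        dbTarget.Parameters.Add(new DatabaseParameterInfo("@exception", "${exception:format=tostring}"));

        var config = LogManager.Configuration ?? new LoggingConfiguration();
        config.AddTarget("errorDb", dbTarget);
        // Minimum level Error means only Error and Fatal reach the database;
        // lower levels keep going to the existing file targets.
        config.LoggingRules.Add(new LoggingRule("*", LogLevel.Error, dbTarget));
        LogManager.Configuration = config;
    }
}
```

If both the website and the worker call something like this at startup, the admin panel only has to query the single ErrorLog table to show errors from both.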

answered 2013-09-24T09:00:26