I have a service that pulls emails for parsing. Each email is parsed by multiple visitors, all implementing a simple IEmailVisitor interface with one method: void Visit(VisitableEmail email). For some background context, the visitor implementations include a SubjectVisitor, BodyVisitor, SummaryVisitor and so on.
The service has an IList<IEmailVisitor> which gets created once on startup and is then reused in a timer event in this manner:
foreach (var email in emailsToParse)
{
    foreach (var visitor in _visitors)
    {
        email.Accept(visitor);
    }
}
The Email class has this method: public void Accept(IEmailVisitor visitor) { visitor.Visit(this); }
As each email is visited, the visitor sets (or changes) properties on that same email instance.
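To make that concrete, a stripped-down visitor looks roughly like this (the property names RawSubject and ParsedSubject are simplified placeholders, not my exact code):

// Simplified for the question - the real VisitableEmail has many more properties.
public class VisitableEmail
{
    public string RawSubject { get; set; }
    public string ParsedSubject { get; set; }

    public void Accept(IEmailVisitor visitor) { visitor.Visit(this); }
}

public interface IEmailVisitor
{
    void Visit(VisitableEmail email);
}

public class SubjectVisitor : IEmailVisitor
{
    public void Visit(VisitableEmail email)
    {
        // Reads from the email it is handed and writes a parsed value
        // back onto that same email instance.
        email.ParsedSubject = email.RawSubject?.Trim();
    }
}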
There can be quite a few emails to process. My question is: would I be safe converting the above code to:
Parallel.ForEach(emailsToParse, email =>
{
    foreach (var visitor in _visitors)
        email.Accept(visitor);
});
None of my visitors maintain state between invocations of Visit(this). I'm sure this question reflects my fairly superficial knowledge of task parallelism, but despite the reading I've been doing, I'm still unsure whether this would be a safe approach (assuming there are enough emails each time to justify the parallelism).
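To be explicit about what I mean by "no state": none of my visitors look like the hypothetical example below, which (as I understand it) would be a problem under Parallel.ForEach because the same visitor instances are shared across all emails:

// Hypothetical counter-example - I do NOT have visitors like this.
public class WordCountVisitor : IEmailVisitor
{
    // Shared mutable field: with Parallel.ForEach, multiple threads would
    // read-modify-write this concurrently, producing lost updates.
    private int _totalWords;

    public void Visit(VisitableEmail email)
    {
        _totalWords += email.ParsedSubject?.Split(' ').Length ?? 0;
    }
}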