
I have a function that returns a Collection<string> and that calls itself recursively to eventually return one big Collection<string>.

Now I'm wondering what the best approach is to merge the lists. Collection<T>.CopyTo() only copies to a string[], and using a foreach loop feels inefficient. However, since I also want to filter out duplicates, I suspect I'll end up with a foreach that calls Contains() on the Collection.

I wonder, is there a more efficient way to have a recursive function return a list of strings without duplicates? I don't have to use a Collection; it can be pretty much any suitable data type.

The only restriction: I'm bound to Visual Studio 2005 and .NET 3.0, so no LINQ.

Edit: To clarify: the function takes a user out of Active Directory, looks at the user's direct reports, and then recursively looks at the direct reports of every one of those users. So the end result is a list of all users in the "command chain" of a given user. Since this is executed quite often and at the moment takes 20 seconds for some users, I'm looking for ways to improve it. Caching the result for 24 hours is also on my list, by the way, but I want to see how to improve the function before applying caching.


5 Answers


If you're using List<T> you can use AddRange to append one list to the other.
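
For example, a minimal sketch (the list contents here are made up for illustration):

using System;
using System.Collections.Generic;

class AddRangeExample
{
    static void Main()
    {
        List<string> combined = new List<string>();
        combined.AddRange(new string[] { "a", "b" });
        combined.AddRange(new string[] { "b", "c" }); // note: AddRange does not filter duplicates

        Console.WriteLine(combined.Count); // 4
    }
}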

Or you can use yield return to combine lists on the fly like this:

public IEnumerable<string> Combine(IEnumerable<string> col1, IEnumerable<string> col2)
{
    foreach(string item in col1)
        yield return item;

    foreach(string item in col2)
        yield return item;
}
answered 2008-09-11T09:06:44.990

I think HashSet<T> is a great help.

The HashSet<T> class provides high performance set operations. A set is a collection that contains no duplicate elements, and whose elements are in no particular order.

Just add items to it and then use CopyTo.


Update: HashSet<T> is only available in .NET 3.5, so it won't work on .NET 3.0.

Maybe you can use Dictionary<TKey, TValue> instead. Assigning a duplicate key through the indexer simply overwrites the existing entry rather than raising an exception (unlike Add, which throws).
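
A minimal sketch of using Dictionary<TKey, TValue> as a stand-in for a set on .NET 2.0/3.0 (the names and contents here are illustrative):

using System;
using System.Collections.Generic;

class DictionaryAsSetExample
{
    static void Main()
    {
        Dictionary<string, bool> seen = new Dictionary<string, bool>();

        string[] first = new string[] { "alice", "bob" };
        string[] second = new string[] { "bob", "carol" };

        // The indexer overwrites silently, so duplicates are absorbed;
        // Dictionary.Add would throw an ArgumentException on "bob" instead.
        foreach (string name in first)
            seen[name] = true;
        foreach (string name in second)
            seen[name] = true;

        // The merged, de-duplicated result is the key collection.
        List<string> merged = new List<string>(seen.Keys);
        Console.WriteLine(merged.Count); // 3
    }
}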

answered 2008-09-11T09:03:29.943

You might want to take a look at Iesi.Collections and the extended generic Iesi.Collections (the original was written against .NET 1.1, before generics existed).

The extended Iesi library has an ISet type which acts exactly like a HashSet: it enforces unique members and does not allow duplicates.

The nifty thing about Iesi is that it has set operators instead of methods for merging collections, so you have the choice between a union (|), intersection (&), XOR (^) and so forth.
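
A rough sketch of what that might look like. This assumes the generic Iesi.Collections.Generic namespace provides a HashedSet<T> implementation of ISet<T> with the overloaded set operators mentioned above; check the exact API of the version you reference, since it may differ:

// Assumption: HashedSet<T> and the | operator exist as described; verify
// against the Iesi.Collections build you actually use.
using System;
using Iesi.Collections.Generic;

class IesiSetExample
{
    static void Main()
    {
        HashedSet<string> a = new HashedSet<string>();
        a.Add("alice");
        a.Add("bob");

        HashedSet<string> b = new HashedSet<string>();
        b.Add("bob");
        b.Add("carol");

        // Union via the overloaded | operator; duplicates collapse automatically.
        ISet<string> union = a | b;
        Console.WriteLine(union.Count); // 3
    }
}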

answered 2008-09-11T09:13:14.367

Can you pass the Collection into your method by reference, so that you can just add items to it? That way you don't have to return anything. This is what it might look like if you did it in C#.

using System;
using System.Collections.ObjectModel;

class Program
{
    static void Main(string[] args)
    {
        Collection<string> myItems = new Collection<string>();
        MyMethod(ref myItems);
        Console.WriteLine(myItems.Count.ToString());
        Console.ReadLine();
    }

    // The callee adds to the caller's collection, so there is nothing to merge afterwards.
    static void MyMethod(ref Collection<string> myItems)
    {
        myItems.Add("string");
        if (myItems.Count < 5)
            MyMethod(ref myItems);
    }
}

As stated by @Zooba, passing by ref is not necessary here; passing the collection by value also works, because Collection<string> is a reference type and the callee still adds to the same instance.
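
Applied to the direct-reports scenario from the question, the same idea might look like the sketch below. GetDirectReports is a hypothetical placeholder for the actual Active Directory lookup, and the shared dictionary doubles as the duplicate filter, so no per-level merge or Contains() scan is needed:

using System;
using System.Collections.Generic;

class CommandChainSketch
{
    static void Main()
    {
        Dictionary<string, bool> seen = new Dictionary<string, bool>();
        CollectReports("some.manager", seen);
        Console.WriteLine(seen.Count); // users in the "command chain"
    }

    static void CollectReports(string user, Dictionary<string, bool> seen)
    {
        foreach (string report in GetDirectReports(user))
        {
            if (seen.ContainsKey(report))
                continue; // already visited: skips duplicates (and any cycles)

            seen[report] = true;
            CollectReports(report, seen); // recurse; nothing to merge afterwards
        }
    }

    // Hypothetical stand-in for the Active Directory "directReports" query.
    static IEnumerable<string> GetDirectReports(string user)
    {
        return new string[0];
    }
}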

answered 2008-09-11T09:16:40.490

As far as merging goes:

I wonder, is there a more efficient way to have a recursive function that returns a list of strings without duplicates? I don't have to use a Collection, it can be pretty much any suitable data type.

Your function assembles a return value, right? You're splitting the supplied list in half, invoking yourself again (twice), and then merging those results.

During the merge step, why not just check before you add each string to the result? If it's already there, skip it.

Assuming you're working with sorted lists, of course.
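
For sorted inputs, that merge-and-skip step might look like the following sketch (assuming both lists are already sorted with an ordinal comparison):

using System;
using System.Collections.Generic;

class SortedMergeExample
{
    static void Main()
    {
        List<string> left = new List<string>(new string[] { "a", "b", "d" });
        List<string> right = new List<string>(new string[] { "b", "c", "d" });
        Console.WriteLine(string.Join(",", Merge(left, right).ToArray())); // a,b,c,d
    }

    static List<string> Merge(List<string> left, List<string> right)
    {
        List<string> result = new List<string>(left.Count + right.Count);
        int i = 0, j = 0;

        while (i < left.Count || j < right.Count)
        {
            // Take the smaller head element (or whatever remains once one list runs out).
            string next;
            if (j >= right.Count || (i < left.Count && string.CompareOrdinal(left[i], right[j]) <= 0))
                next = left[i++];
            else
                next = right[j++];

            // Because the inputs are sorted, only the last element added needs checking.
            if (result.Count == 0 || result[result.Count - 1] != next)
                result.Add(next);
        }

        return result;
    }
}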

answered 2008-09-11T11:49:33.540