I have a LINQ query on a DbSet that hits a table and grabs 65k rows. The query takes about 3 minutes, which seems like obviously too much to me. I don't have a baseline for comparison, but I'm certain this can be improved. I'm relatively new to EF and LINQ, so I suspect I may also be structuring my query in a way that is a big "NO".
I read that change tracking is where EF spends most of its time, and it is enabled on the entity in question, so perhaps I should turn that off (if so, how)?
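From what I've read, I'm guessing that turning tracking off for a single read-only query would look something like this, though I'm not sure (here "context" and "someBundleId" are just placeholders for my actual DbContext and values, and I'm assuming the entity exposes a ReportsBundleId foreign-key property):

// My guess at a no-tracking query -- "context" and "someBundleId" are placeholders
var entries = context.TradesReportEntries
    .AsNoTracking()   // don't attach the 65k entities to the change tracker
    .Where(e => e.ReportsBundleId == someBundleId)
    .ToList();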
Here's the code:
ReportTarget reportTarget = repository.GetById(reportTargetId);
if (reportTarget != null)
{
    ReportsBundle targetBundle = reportTarget.SavedReportsBundles.SingleOrDefault(rb => rb.ReportsBundleId == target.ReportsBundleId);
    if (targetBundle != null)
    {
        // ... the slow query below lives here
    }
}
This next query takes 3 minutes to execute (65k records):
IPoint[] pointsData = targetBundle.ReportEntries
    .Where(e => ... a few conditions )
    .Select((entry, i) => new
    {
        rowID = entry.EntryId,
        x = entry.Profit,
        y = i,
        weight = target.HiddenPoints.Contains(entry.EntryId) ? 0 : 1,
        group = 0
    }.ActLike<IPoint>())
    .ToArray();
Note: ActLike() is from the ImpromptuInterface library, which uses the .NET DLR to create dynamic proxies that implement an interface on the fly. I doubt this is the bottleneck.
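For what it's worth, here is roughly the alternative I was considering: querying the DbSet directly so the Where and the column projection run as SQL, instead of enumerating the lazy-loaded targetBundle.ReportEntries collection (which, as far as I understand, pulls every related row into memory before the Where runs), and only doing the indexed Select and ActLike() in memory on the reduced result. This is only a sketch; "context" is a placeholder for my DbContext, and I'm assuming the entry entity exposes a ReportsBundleId foreign-key property:

// Sketch only: filter/project in SQL, then finish in memory
IPoint[] pointsData = context.TradesReportEntries
    .AsNoTracking()                                    // read-only query, skip change tracking
    .Where(e => e.ReportsBundleId == targetBundle.ReportsBundleId /* && ... the other conditions */)
    .Select(e => new { e.EntryId, e.Profit })          // pull only the columns I need
    .AsEnumerable()                                    // switch to LINQ to Objects for the indexed Select + ActLike
    .Select((entry, i) => new
    {
        rowID = entry.EntryId,
        x = entry.Profit,
        y = i,
        weight = target.HiddenPoints.Contains(entry.EntryId) ? 0 : 1,
        group = 0
    }.ActLike<IPoint>())
    .ToArray();

I don't know whether that is actually the right approach, though.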
How can I optimize performance for this particular DbSet (TradesReportEntries), given that I'll be querying this table for large data sets (IPoint[]s) often?