(I'm not sure that the graphs you linked to qualify as an intensive task, but regardless)
I've gotten good results by breaking up the tasks using timeouts. Say you're doing something like this:
var largeSelection = d3.selectAll('svg circle')
    .data(aReallyLargeDataset); // expensive bind operation

largeSelection.enter()
    .append('circle') // lots of appending
    .attr('r', function() { /* expensive calculations */ return ... });

largeSelection // many refreshes
    .attr('cx', function() { /* more expensive calculations */ return ... });
That might take the browser a full second to render (a long time, considering the page is frozen for the duration). You can make it better by breaking it up like so:
setTimeout(function() {
    var largeSelection = d3.selectAll('svg circle')
        .data(aReallyLargeDataset); // expensive bind operation

    setTimeout(function() {
        largeSelection.enter()
            .append('circle') // lots of appending
            .attr('r', function() { /* expensive calculations */ return ... });

        setTimeout(function() {
            largeSelection // many refreshes
                .attr('cx', function() { /* more expensive calculations */ return ... });
        }, 100);
    }, 100);
}, 100);
Sorry about the obnoxious nesting and timeouts; you could refactor/abstract it into something more readable and scalable. In any case, breaking up the tasks this way gives the browser a chance to "breathe" and update the DOM, so that from the user's perspective the application doesn't seem "stuck".
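For instance, one way to flatten that nesting is a tiny sequencer. This is just a sketch: runInChunks is a hypothetical helper, not part of d3, and the task names in the usage comment stand in for the three phases above.

```javascript
// Run an array of functions one per timeout tick, giving the browser a
// chance to repaint between steps. Resolves when every task has run.
function runInChunks(tasks, delay) {
    return new Promise(function(resolve) {
        function next(i) {
            if (i >= tasks.length) { resolve(); return; }
            setTimeout(function() {
                tasks[i]();   // do one chunk of work
                next(i + 1);  // then yield back to the browser
            }, delay);
        }
        next(0);
    });
}

// Usage with the phases above (bindData, appendCircles, refreshPositions
// would each wrap one of the d3 snippets):
// runInChunks([bindData, appendCircles, refreshPositions], 100);
```

Adding a fourth or fifth phase is then just another element in the array, instead of another level of nesting.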
If that still feels sluggish, you can break things up even more:
var entering = largeSelection.enter()
    .append('circle'); // lots of appending

setTimeout(function() {
    entering.attr('r', function() { /* expensive calculations */ return ... });
}, 100);
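And if a single .attr() pass over the whole selection is still the bottleneck, you can slice the work across several timeouts. A sketch, with processInSlices as a hypothetical generic helper (the d3 call in the usage comment assumes v4+'s selection.nodes()):

```javascript
// Apply a function to an array in slices of chunkSize, one slice per
// timeout tick, so the browser can repaint between slices.
function processInSlices(items, applyFn, chunkSize, delay, done) {
    var i = 0;
    function step() {
        applyFn(items.slice(i, i + chunkSize)); // process one slice
        i += chunkSize;
        if (i < items.length) setTimeout(step, delay);
        else if (done) done();
    }
    step();
}

// Usage: set the expensive attribute 500 circles at a time.
// processInSlices(entering.nodes(), function(nodes) {
//     d3.selectAll(nodes).attr('r', function() { /* expensive */ return ... });
// }, 500, 100);
```

The right chunk size is a trade-off: smaller slices keep the page more responsive but stretch out the total render time.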