
I am trying to do a fairly simple computation, but I get an out-of-memory error whose origin is not clear to me. Here is what I want: for the small-world model (watts.strogatz.game) and several rewiring probabilities p, I create nsamp graphs and compute the distance between the first node and the opposite node at n/2. When nsamp = 1, everything works fine. Setting it to 100 or larger gives an "out of memory" error right in the first round (i = 1, j = 5), i.e., for the smallest p value (0.01). Setting nsamp = 10 gives no error (huh? Shouldn't that also include i = 1, j = 5?). I tried to explicitly remove the larger objects (graph, len) with rm(), but to no avail. Any ideas? Here is the code:

require(igraph)
probs <- (1:25) * 0.01
n <- 1000
target <- n/2
nsamp <- 100


for (i in seq_along(probs)) {
   for (j in 1:nsamp) {
      graph <- watts.strogatz.game(dim = 1, size = n, p = probs[i], nei = 4)

      # distance between node 1 and the opposite node
      dist1ToTarget <- shortest.paths(graph, v = V(graph)[1], to = V(graph)[target])

      len <- get.all.shortest.paths(graph, from = V(graph)[1])
      rm(graph)

      # The number of shortest paths between node 1 and target:
      numbPathsBetw1AndTarget <- len$nrgeo[target]

      # len$res holds all shortest paths between node 1 and every other
      # node. It comes as a list whose elements are vectors of vertex ids,
      # so we use lapply to find out which of the list entries are of
      # interest to us.
      pathLengths <- unlist(lapply(len$res, function(x) length(x)))

      rm(len)

   }
}

I am aware that I can increase the memory, but it bugs me that everything is fine when nsamp is small.
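For completeness, here is a minimal sketch of how one could check how large the intermediate objects actually get and force garbage collection between iterations (object.size() and gc() are base R; the graph parameters are the same as above):

require(igraph)

n <- 1000
graph <- watts.strogatz.game(dim = 1, size = n, p = 0.01, nei = 4)
len <- get.all.shortest.paths(graph, from = V(graph)[1])

# report how much memory the list of all shortest paths occupies
print(object.size(len), units = "Mb")

rm(graph, len)
invisible(gc())   # trigger garbage collection explicitly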

Edit: it is so unreproducible that I suspect it is not a memory leak but rather a pathological configuration of the graph. It always happens for i = 1, but for different j. Thus: is there an intrinsic upper limit on the diameter of the graph for get.all.shortest.paths to be applicable, or on the number of shortest paths?
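To probe that suspicion, a small diagnostic along these lines could show how large the output of get.all.shortest.paths gets for small p (len$nrgeo and len$res are the fields my code above already uses):

require(igraph)

n <- 1000
graph <- watts.strogatz.game(dim = 1, size = n, p = 0.01, nei = 4)
len <- get.all.shortest.paths(graph, from = V(graph)[1])

# nrgeo[v] counts the geodesics from node 1 to v;
# res materialises every one of those paths as a vector of vertex ids
cat("max geodesics to a single node:", max(len$nrgeo), "\n")
cat("paths stored in len$res:", length(len$res), "\n")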

Edit 2: the unreproducible error vanished after restarting R. Maybe the RAM was still clogged by an earlier computation.
