
I'm using a JavaScript map that takes GeoJSON as its input to visualise annual UK film screenings. You can see it working with 2011 data here: http://screened2011.herokuapp.com

Generating the GeoJSON is very inefficient - each "tile" typically takes 5+ seconds.

I have a Ruby app that queries MongoDB for the set of "screenings" within a bounding box (requested by the JS) and then builds a two-dimensional array representing the total number of screenings that fall in each square of a 16x16 grid. This is clearly the bottleneck - it's hammering the server and pulling down all of those screenings.

I'd like to replace this with a map/reduce query that aggregates the screening counts in the bounding box into a 16x16 array of values, but I haven't had much success - this is my first attempt at map/reduce!

Here's a simplified version of my code with the irrelevant bits removed (it's ugly - I'd refactor it if this weren't a hack that's nearly finished):

get :boxed, :map => "/screenings/year/:year/quadrant/:area/bbox/:bbox", :provides => [:json, :jsonp], :cache => false do
    box = parameter_box(params[:bbox]) # returns [[minx,miny],[maxx,maxy]]
    year = params[:year].to_i
    screenings = Screening.where(:location.within(:box) => box).where(:year => year)
    jsonp Screening.heatmap(screenings, box, 16)
end

def self.heatmap screenings, box, scale
  grid = []
  min_x = box[0][0]
  min_y = box[0][1]
  max_x = box[1][0]
  max_y = box[1][1]
  box_width = max_x.to_f - min_x.to_f
  box_height = max_y.to_f - min_y.to_f

  return [] if box_width == 0 || box_height == 0

  # Set up an empty GeoJSON-style array to hold the results
  scalef = scale.to_f
  (0..(scale - 1)).each do |i| 
    grid[i] = []
    (0..(scale - 1)).each do |j| 

      box_min_x = min_x + (i * ( box_width / scalef  ))
      box_max_x = min_x + ((i + 1) * ( box_width / scalef  ))
      box_min_y = min_y + (j * ( box_height / scalef  ))
      box_max_y = min_y + ((j + 1) * ( box_height / scalef  ))

      grid[i][j] = { 
        :count => 0,
        #:id => "#{box_min_x}_#{box_max_x}_#{box_min_y}_#{box_max_y}",
        :coordinates => [
          [
            [box_min_x,box_min_y], [box_min_x, box_max_y], [box_max_x, box_max_y], [box_max_x, box_min_y], [box_min_x,box_min_y]
          ]
        ]
      } 
    end
  end

  # This loop is the bottleneck and I'd like to replace with a map-reduce
  screenings.only(:location, :total_showings).each do |screening|
    x = (scale * ((screening.location[0] - min_x) / box_width)).floor
    y = (scale * ((screening.location[1] - min_y) / box_height)).floor
    raise if x > (scale - 1)
    raise if y > (scale - 1)
    grid[x][y][:count] += screening.total_showings.to_i
  end

  # This converts the resulting 16x16 into GeoJSON
  places = []
  grid.each do |x|
    x.each do |p|
      if p[:count].to_i > 0
        properties = {}
        properties[:total_showings] = p[:count]
        places << {
          "id" => p[:id],
          "type" => "Feature",
          "geometry" => {
            "type" => "Polygon",
            "coordinates" => p[:coordinates]
          },
          "properties"=> properties
        }
      end
    end
  end

  {
    "type" => "FeatureCollection",
    "features" => places
  }
end

I'm using Mongoid, so I can chain a map_reduce onto the screenings query, which I'm hoping will speed this up considerably - but how should I get something like the following passed into that function?:

[
  [1,20000,30,3424,53,66,7586,54543,76764,4322,7664,43242,43,435,32,643],
  ...
]

...based on a few million records in this structure (basically summing the records that fall inside the bounding box):

{"_id"=>BSON::ObjectId('50e481e653e6dfbc92057e8d'),
 "created_at"=>2013-01-02 18:52:22 +0000,
 "ended_at"=>Thu, 07 Jun 2012 00:00:00 +0100,
 "events"=>["10044735484"],
 "film_id"=>BSON::ObjectId('4f96a91153e6df5ebc001afe'),
 "genre_ids"=>[],
 "location"=>[-2.003309596016, 52.396317185921],
 "performance_id"=>"9001923080",
 "specialised"=>false,
 "started_at"=>Fri, 01 Jun 2012 00:00:00 +0100,
 "total_showings"=>1,
 "updated_at"=>2013-01-02 18:52:22 +0000,
 "venue_id"=>BSON::ObjectId('4f9500bf53e6df004000034d'),
 "week"=>nil,
 "year"=>2012}

Thanks in advance!
