
I'm currently fetching about 2 million records from different tables to generate URLs for a sitemap. The script eats too many resources and pushes the server to 100% load.

The query:

 SELECT CONCAT("/url/profile/id/",u.id,"/",nickname) as url FROM users AS u
    UNION ALL
    Select CONCAT("url/city/", c.id, "/paramId/",p.id,"/",Replace(p.title, " ", "+"),"/",r.region_Name,"/",c.city_Name) AS url
    From city c 
    Join region r On r.id = c.id_region 
    Join country country On country.id = c.id_country
    cross join param p
    Where country.used = 1
    And p.active = 1 

// I store the rows in an array $url_list and then build the sitemap from it, but it takes a long time and too many resources.

// I tried fetching the data in batches with LIMIT 0,50000, but the COUNT(*) query needed for paging is slow, and the code looks bad because I have to run two queries over a large data set.

$url_list = array();

// Row count for paging: the same UNION query wrapped in COUNT(*).
$this->db->query('SELECT COUNT(*) AS max FROM (
    SELECT CONCAT("/url/profile/id/", u.id, "/", nickname) AS url FROM users AS u
    UNION ALL
    SELECT CONCAT("url/city/", c.id, "/paramId/", p.id, "/", REPLACE(p.title, " ", "+"), "/", r.region_Name, "/", c.city_Name) AS url
    FROM city c
    JOIN region r ON r.id = c.id_region
    JOIN country country ON country.id = c.id_country
    CROSS JOIN param p
    WHERE country.used = 1
    AND p.active = 1) AS tmp');
$row = $this->db->recordsArray(MYSQL_ASSOC);
$maxrow = $row[0]['max'];

$limit = 50000;
$bybatch = ceil($maxrow / $limit);
$start = 0;
for ($i = 0; $i < $bybatch; $i++) {
    // Run the UNION query for the current page and store the rows in $result.
    $this->db->query('(SELECT CONCAT("/url/profile/id/", u.id, "/", nickname) AS url FROM users AS u
        UNION ALL
        SELECT CONCAT("url/city/", c.id, "/paramId/", p.id, "/", REPLACE(p.title, " ", "+"), "/", r.region_Name, "/", c.city_Name) AS url
        FROM city c
        JOIN region r ON r.id = c.id_region
        JOIN country country ON country.id = c.id_country
        CROSS JOIN param p
        WHERE country.used = 1
        AND p.active = 1 LIMIT ' . $start . ',' . $limit . ')');
    $result = $this->db->recordsArray(MYSQL_ASSOC);

    $start += $limit;
    // Append this batch to $url_list.
    $url_list = array_merge($url_list, $result);
}

// When finished, I use this to create the sitemap files:

$linkCount = 1;
$fileNomb = 1;

foreach ($url_list as $ul) {
    // Start a fresh document at the beginning of every file.
    if ($linkCount == 1) {
        $doc  = new DOMDocument('1.0', 'utf-8');
        $doc->formatOutput = true;
        $root = $doc->createElementNS('http://www.sitemaps.org/schemas/sitemap/0.9', 'urlset');
        $doc->appendChild($root);
    }

    $url = $doc->createElement("url");
    $loc = $doc->createElement("loc", $ul['url']);
    $url->appendChild($loc);
    $priority = $doc->createElement("priority", "1");
    $url->appendChild($priority);

    $root->appendChild($url);

    $linkCount += 1;

    // Roll over to a new file just under the 50,000-URL sitemap limit.
    if ($linkCount == 49999) {
        $f = fopen($this->siteMapMulti . $fileNomb . '.xml', "w");
        fwrite($f, $doc->saveXML());
        fclose($f);

        $linkCount = 1;
        $fileNomb += 1;
    }
}

// Write the last, partially filled document as well.
if ($linkCount > 1) {
    $f = fopen($this->siteMapMulti . $fileNomb . '.xml', "w");
    fwrite($f, $doc->saveXML());
    fclose($f);
}

Is there a better way to do this, or a way to speed up the performance?

Added

Why is the following faster than the SQL query above, yet it still consumes 100% of the server's resources?

$this->db->query('SELECT c.id, c.city_name, r.region_name, cr.country_name FROM city AS c, region AS r, country AS cr  WHERE r.id = c.id_region AND cr.id = c.id_country AND cr.id IN (SELECT id FROM country WHERE use = 1)');

$arrayCity = $this->db->recordsArray(MYSQL_ASSOC);

 $this->db->query('SELECT id, title FROM param WHERE active = 1');

$arrayParam = $this->db->recordsArray(MYSQL_ASSOC);

foreach ($arrayCity as $city) {
        foreach ($arrayParam as $param) {
          $paramTitle = str_replace(' ', '+', $param['title']);
          $url = 'url/city/'. $city['id'] .'/paramId/'. $param['id'] .'/'. $paramTitle .'/'. $city['region_name'] .'/'. $city['city_name'];
          $this->addChild($url);
        }
}

1 Answer

I would suggest that you not use UNION and simply issue two separate queries; that will speed up the queries themselves. Also, as you mentioned above, fetching the data in batches is a good idea.

Finally, don't collect all of the data in memory. Write it to the file immediately, inside the loop.

Simply open the file at the start, write each URL entry inside the loop, and close the file at the end:

- open the file for writing
- run the COUNT query on the users table
- do several SELECT ... LIMIT queries in a loop (as you already do)
- inside that loop, write each row to the file with while ($row = mysql_fetch_array())

Then repeat the same algorithm for the other table. It is useful to implement a function that writes the data to the file, so you can call it for both queries and follow the DRY principle.
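
A minimal sketch of those steps, assuming the legacy mysql_* API mentioned above and an already open connection; write_urls() and the output file name are illustrative names, not from the question:

<?php
// Sketch only: stream each URL straight to a file instead of holding a
// 2-million-entry $url_list in memory.

$limit = 50000;

// Run one LIMIT-ed query and append every returned URL to the open file.
function write_urls($f, $sql)
{
    $result = mysql_query($sql);
    while ($row = mysql_fetch_array($result, MYSQL_ASSOC)) {
        fwrite($f, $row['url'] . "\n");
    }
}

// 1. Open the file for writing.
$f = fopen('sitemap_urls.txt', 'w');

// 2. COUNT query on the users table (cheap without the UNION).
$res = mysql_query('SELECT COUNT(*) AS max FROM users');
$row = mysql_fetch_array($res, MYSQL_ASSOC);
$batches = ceil($row['max'] / $limit);

// 3./4. Several SELECT ... LIMIT queries in a loop, each batch written
// out immediately instead of being accumulated in an array.
for ($i = 0; $i < $batches; $i++) {
    write_urls($f, 'SELECT CONCAT("/url/profile/id/", u.id, "/", nickname) AS url
                    FROM users AS u
                    LIMIT ' . ($i * $limit) . ',' . $limit);
}

// Repeat the same count-and-loop with the city/param query, reusing
// write_urls() for the writing (DRY), then close the file.
fclose($f);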

Answered 2013-10-01T04:12:16.570