
I'm trying to do a mass update of field values through an API, but I'm hitting the PHP script's maximum execution time.

I want to split my work into smaller tasks and run them asynchronously as smaller jobs...

Asynchronous PHP calls?

I found this article and it looks good, but the comments are a bit off-putting... Will using curl to run external script files prevent the caller file from triggering the maximum execution time, or will curl still wait for a response from the server and kill my page?

The real question is: how do I run asynchronous jobs in PHP? Something like AJAX.

EDIT:

There is a project management tool that holds a large number of data rows. I'm using this tool's API to access the rows and display them on my page. A user of my tool will select multiple rows of data with checkboxes and then type a new value into a box. The user then presses an "Update row values" button that runs the update script.

This update script splits the possibly hundreds or thousands of selected items into groups of 100.

At this point I intend to use some asynchronous method to contact the project management tool and update all 100 items.

Because that server may take a long time to run its process while updating those items, I need to make sure that the original page that splits up these jobs stops waiting for a response from that operation, so that I can fire off more requests to update items, and so that my server page can tell my user "OK, the update is in progress; it may take a while, and we'll send an email once it's done."

    $step = 100;
    $itemCount = GetItemCountByAppId( $appId );
    $loopsRequired = $itemCount / $step;            
    $loopsRequired = ceil( $loopsRequired );

    $process = array();

    for( $a = 0; $a < $loopsRequired; $a++ )
    {
        $items = GetItemsByAppId( $appId, array( 
            "amount" => $step, 
            "offset" => ( $step *  $a ) 
        ) );  

        foreach( $items[ "items" ] as $key => $item )
        {
            foreach( $fieldsGroup as $fieldId => $fieldValues )
            {
                $itemId = $item->__attributes[ "item_id" ];
                /*array_push( $process, array(
                    "itemId" => $itemId,
                    "fieldId" => $fieldId,
                ) );*/
                UpdateFieldValue( $itemId, $fieldId, $fieldValues );
                // This Update function is actually calling the server and I assume it must be waiting for a response... thus my code times out after 30 secs of execution
            }
        }  

        //curl_post_async($url, $params);
    }
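
For reference, curl_post_async() is not a built-in PHP function; it usually refers to a hand-rolled "fire and forget" request. A minimal sketch of such a helper, assuming the worker script sits on the same host over plain HTTP and accepts ordinary POST parameters, might look like this:

    // Hypothetical fire-and-forget POST: write the request, then close the
    // socket without reading the response, so this page does not wait.
    function curl_post_async( $url, $params = array() )
    {
        $post  = http_build_query( $params );
        $parts = parse_url( $url );

        $host = $parts[ "host" ];
        $port = isset( $parts[ "port" ] ) ? $parts[ "port" ] : 80;
        $path = isset( $parts[ "path" ] ) ? $parts[ "path" ] : "/";

        // Short connect timeout; only the connection itself can block here.
        $fp = fsockopen( $host, $port, $errno, $errstr, 5 );
        if( $fp === false )
        {
            return false;
        }

        $request  = "POST " . $path . " HTTP/1.1\r\n";
        $request .= "Host: " . $host . "\r\n";
        $request .= "Content-Type: application/x-www-form-urlencoded\r\n";
        $request .= "Content-Length: " . strlen( $post ) . "\r\n";
        $request .= "Connection: Close\r\n\r\n";
        $request .= $post;

        fwrite( $fp, $request );
        fclose( $fp ); // do not wait for the response

        return true;
    }

Note that the script receiving this request should call ignore_user_abort( true ) and set its own time limit, otherwise PHP may stop it once the connection is closed.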

2 Answers


If you are using PHP-CLI, try Threads, or fork() for non-thread-safe builds.
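
A minimal sketch of the fork() route, assuming PHP-CLI with the pcntl extension; $allItemIds and processChunk() are hypothetical stand-ins for the selected items and for one batch of API updates:

    $chunks = array_chunk( $allItemIds, 100 ); // e.g. 100 item ids per job

    $children = array();
    foreach( $chunks as $chunk )
    {
        $pid = pcntl_fork();
        if( $pid == -1 )
        {
            die( "could not fork" );
        }
        elseif( $pid == 0 )
        {
            // Child process: run one batch of updates, then exit.
            processChunk( $chunk );
            exit( 0 );
        }
        // Parent process: remember the child and keep looping.
        $children[] = $pid;
    }

    // Parent waits for every child to finish.
    foreach( $children as $pid )
    {
        pcntl_waitpid( $pid, $status );
    }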

answered 2013-05-20T11:24:20.190

Depending on how you implement it, asynchronous PHP might be used to decouple the web request from the processing and therefore isolate the web request from any timeout in the processing (but you could do the same thing within a single thread).

Will breaking the task into smaller concurrent parts make it run faster? Probably not - usually this will extend the total time it takes for the job to complete; about the only time this is not the case is when you have a very large processing capacity and can distribute the task effectively (e.g. map-reduce).

Are HTTP calls (curl) an efficient way to distribute work like this? No. There are other methods, including synchronous and asynchronous messaging, batch processing, process forking, threads... each with its own benefits and complications - and we don't know what problem you are trying to solve.
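
To illustrate the decoupling idea, one option is a simple database-backed queue: the web request only records the work and returns at once, and a CLI/cron worker (which is not bound by the web server's timeouts) makes the API calls. This is only a sketch under assumptions - the update_jobs table, the PDO wiring and the helper names are illustrative, with UpdateFieldValue() standing in for the asker's existing API call:

    // Web request side: record each update and return immediately.
    function enqueueUpdateJob( PDO $pdo, $itemId, $fieldId, $fieldValues )
    {
        $stmt = $pdo->prepare(
            "INSERT INTO update_jobs ( item_id, field_id, field_values, status )
             VALUES ( ?, ?, ?, 'pending' )"
        );
        $stmt->execute( array( $itemId, $fieldId, json_encode( $fieldValues ) ) );
    }

    // Worker side (run from CLI/cron): process pending jobs in batches.
    function runPendingJobs( PDO $pdo )
    {
        $rows = $pdo->query(
            "SELECT id, item_id, field_id, field_values
             FROM update_jobs WHERE status = 'pending' LIMIT 100"
        )->fetchAll( PDO::FETCH_ASSOC );

        foreach( $rows as $row )
        {
            UpdateFieldValue( $row[ "item_id" ], $row[ "field_id" ],
                json_decode( $row[ "field_values" ], true ) );

            $pdo->prepare( "UPDATE update_jobs SET status = 'done' WHERE id = ?" )
                ->execute( array( $row[ "id" ] ) );
        }
    }

Once the worker finds no more pending rows for a batch, it could send the "update finished" email the asker describes.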

So even before we get to your specific questions, this does not look like a good strategy.

Will using curl to run external script files prevent the caller file from triggering the maximum execution time?

It will be constrained by whatever timeouts are configured on the target server - if that's the same server as the invoking script, then it will be the same timeouts.

will the curl still wait for a response from the server and kill my page?

I don't know what you're asking here - it rather implies that there are functional dependencies you've not told us about.

It sounds like you've picked a solution and are now trying to make it fit your problem.

answered 2013-05-20T11:41:38.037