FUDforum

Home » Imported messages » comp.lang.php » Output status during processing
Re: Output status during processing [message #179534 is a reply to message #179527]
Mon, 05 November 2012 13:22
From: Thomas 'PointedEars'
Scott Johnson wrote:

> What I would like to do is provide an output to the browser during a
> long processing term.
>
> Exactly what I am doing is this.
>
> I am loading a csv file to update inventory.
>
> During this process I am flushing the current Table and then loading the
> new and then comparing the inventory table to a product table and would
> like to provide status of where we are in the process what items matched
> and changed and so forth in real time.
>
> I have done some similar tasks in JS, but with fewer steps in the
> processing than what I am trying to accomplish.
>
> I did some binging and found that it may have to do with output
> buffering but nothing particular to my situation. (as far as I could find)
>
> If this is possible could someone point me in the right direction and I
> will do the leg work.

Contrary to popular belief, it is possible with PHP alone to provide step-
by-step feedback to the client (as well as to a server-side log file) while
a server-side script is being executed.

About a year ago, I did something very similar to what you want to
do: I wrote a PHP script to import product information from a CSV
file into a Magento eCommerce database (in a way that Magento's built-in
import feature was not capable of at the time).

Here is basically what I did:

1. Enable PHP's output buffering (ob_start()).

2. Process a record in the CSV (with fgetcsv() and the Magento API),
writing status information to the server-side log file and (optionally)
to the standard output (with file_put_contents() and `echo',
respectively).

3. Flush and clear the output buffer:

/**
 * Flushes all output buffers and restarts output buffering
 *
 * @author sebastian(at)jcompare(dot)com
 * @link http://php.net/manual/en/function.ob-flush.php
 */
function flush_buffers()
{
    if (ob_get_level() > 0) {
        ob_end_flush();  /* flush the active buffer and end it
                            (calling ob_flush() afterwards would
                            raise a notice if no buffer remains) */
    }
    flush();             /* push the output to the client */
    ob_start();          /* restart buffering for the next record */
}

That displays what is written in step 2 in the console or the browser
window (depending on how the script is run). In order to keep the last
row visible, client-side scripting is used to scroll down to it at
regular intervals.

4. Continue with step 2 if there are more records or files;
exit successfully otherwise.
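Put together, the loop can be sketched as follows (a minimal sketch: the
CSV contents, the status-line format, and the log-file location are
assumptions, and an in-memory stream stands in for the real CSV file):

```php
<?php
// Sketch of steps 1-4 above; adapt the record processing to your
// inventory logic.

function flush_buffers()   // as defined above
{
    if (ob_get_level() > 0) {
        ob_end_flush();
    }
    flush();
    ob_start();
}

ob_start();                                    // step 1: enable buffering

$log = tempnam(sys_get_temp_dir(), 'import');  // server-side log file
$fh  = fopen('php://temp', 'r+');              // stand-in for the CSV file
fwrite($fh, "SKU-1,12\nSKU-2,7");
rewind($fh);

for ($row = 1; ($record = fgetcsv($fh)) !== false; $row++) {
    // step 2: process the record, then log and echo a status line
    $status = sprintf("row %d: %s processed\n", $row, $record[0]);
    file_put_contents($log, $status, FILE_APPEND);
    echo $status;

    flush_buffers();                           // step 3: make it visible now
}

fclose($fh);                                   // step 4: no more records
```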

That has worked beautifully both with text/plain and text/html.

The output-buffering approach does _not_ provide you with the ability to
update arbitrary parts of an (X)HTML document, though; it creates the
response document's content on the fly instead. If you need arbitrary
display updates, you should look into asynchronous XMLHttpRequest, with
which you would poll status information from the server by accessing a
(PHP-)generated resource at regular intervals, and update your (X)HTML
document accordingly. (The output-buffering approach was adequate and
sufficient in my case; the XHR approach was and is employed by the
Magento UI, but was not sufficient for the data format used at the time.)
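For the XHR route, the PHP side of the polled resource can be quite small:
the long-running importer periodically writes its progress somewhere, and a
status script serves it as JSON (a sketch; the status-file location and the
JSON fields are assumptions):

```php
<?php
// status.php (hypothetical): serves the importer's current progress.
// The browser polls this script via XMLHttpRequest at regular
// intervals and updates the document from the returned JSON.

function read_import_status($statusFile)
{
    if (is_readable($statusFile)) {
        // e.g. {"row":120,"total":5000}, as written by the importer
        return file_get_contents($statusFile);
    }
    // no import running (or none started yet)
    return json_encode(array('row' => 0, 'total' => 0));
}

header('Content-Type: application/json');
echo read_import_status(sys_get_temp_dir() . '/import-status.json');
```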

Also, you should be aware that no matter how you do it, a database
connection is unlikely to stay up indefinitely, and a PHP script is
unlikely to run forever. In particular, on the server where my script
ran, the import was supposed to run regularly (from CSV files that were
uploaded by the customer at arbitrary times, with the provision that the
shop information needed to be as up to date as possible). IOW, a cron
job was set up which started the PHP CLI to run the script if it was not
already running.
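The "if it was not already running" check can be done with a non-blocking
exclusive lock on a lock file (a sketch; the lock-file path is an
assumption, and the lock is released automatically when the process exits,
so a crashed run does not block the next one):

```php
<?php
// Sketch: refuse to start a second importer instance.
$lock = fopen(sys_get_temp_dir() . '/import.lock', 'c');

if (!flock($lock, LOCK_EX | LOCK_NB)) {
    // another instance holds the lock: bail out quietly
    fwrite(STDERR, "import already running, exiting\n");
    exit(0);
}

// ... run the import ...

flock($lock, LOCK_UN);
fclose($lock);
```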

But because a CSV file could contain many (thousands of) records (read:
product variants), and there were several such files, it was not possible
to import all records from all files in one run. Either the database
connection used by the Magento API would be severed, or the PHP script
would be terminated early (because of max_execution_time in php.ini,
which for security reasons could not be set to 0 for the Web version that
was still needed for "manual" import).

I solved this by storing the position of the last import in another
file: when the script was executed the next time, it would check whether
a previous position had been saved and continue from there; otherwise it
would start from the beginning of the file(s).
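Saving the resume position comes down to ftell() after each imported record
and fseek() on the next run (a sketch; the function name, the position-file
convention, and the per-run record limit are assumptions):

```php
<?php
// Sketch: import up to $maxRecords records per run, remembering the
// file offset so a later run can continue where this one stopped.
function import_chunk($csvFile, $posFile, $maxRecords)
{
    $fh = fopen($csvFile, 'rb');

    // continue from the saved position, if any
    if (is_readable($posFile)) {
        fseek($fh, (int) file_get_contents($posFile));
    }

    $done = 0;
    while ($done < $maxRecords && ($record = fgetcsv($fh)) !== false) {
        // ... import $record ...
        $done++;
        // remember how far we got, in case we are terminated early
        file_put_contents($posFile, (string) ftell($fh));
    }

    if (feof($fh)) {
        @unlink($posFile);   // finished: the next run starts from the top
    }

    fclose($fh);
    return $done;
}
```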


HTH

PointedEars
--
When all you know is jQuery, every problem looks $(olvable).