Waiting for XML file writing to finish in PHP

My PHP knowledge is limited and I'm having a problem with a script I am running.
I am amalgamating quite a few XML feeds and writing the output to my own server in a custom XML file.
I am using XMLWriter to do this, but the problem I am having is knowing when the file has successfully finished being written.
I am loading the external feeds via SimpleXMLElement, and in total the script takes around 10 seconds to run, but when I do a print_r($xmlWriter) at the end of the script it is empty, and so is the XML file I'm trying to write.
The XML file is eventually written successfully, but in the context of PHP I'm not sure when, and I would like to proceed to some other code only once this has succeeded.
Any help appreciated
Thanks

You could check whether $xmlWriter is empty, since you say this is the problem, and loop until it is not empty, telling PHP to sleep between checks to save CPU:
// while the XML hasn't finished being written yet
while ($xmlWriter == "") {
    // sleep for 10 seconds
    sleep(10);
}
You could even keep a count of how long it has been waiting, give up after a while, and handle the failure.
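That give-up counter could look something like the following sketch. It checks the output file on disk rather than the $xmlWriter variable; the file path, the 10-second sleep, and the attempt limit are all placeholders, and the stand-in write at the top simulates the writer having already finished.

```php
<?php
// Sketch: poll until the output file exists and is non-empty, sleeping
// between checks, and give up after a fixed number of attempts.
$file = sys_get_temp_dir() . '/output.xml';
file_put_contents($file, '<root/>'); // stand-in: pretend the writer finished

$attempts = 0;
$maxAttempts = 6; // give up after roughly a minute
while ((!file_exists($file) || filesize($file) === 0) && $attempts < $maxAttempts) {
    sleep(10);
    clearstatcache(); // filesize() results are cached between calls
    $attempts++;
}

if ($attempts === $maxAttempts) {
    echo "Gave up waiting\n"; // handle the failure here
} else {
    echo "File ready\n";      // safe to proceed with the other code
}
```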

Related

How will XMLReader from the PHP library deal with millions of users concurrently reading/searching a large XML file? Will it hang, or keep processing?

I know XMLReader from PHP (https://www.php.net/xmlreader) can read/search something in an XML file and return data, given a file path.
$xml = new XMLReader();
$xml->open('test.xml');
//Do Something with file test.xml and return searched data
$xml->close();
But I just wanted to know what happens if millions of users concurrently visit a page that uses the function above, meaning it opens the file, returns what the user wants, and then closes the file.
How will this behave under a load of millions of users?
Will the system hang, or will it read from the same already-open file without opening it again?
The documentation says the function will open and close the file every time.
I tried the function above and it returned the data from the XML that I needed, but I am confused by this open-and-close process happening on each call. If millions of users are simultaneously searching for different things in the XML via the same function, how will it handle that scenario?

Display Live Progress and Any Errors During Long Script Execution in CodeIgniter

I'm working on an app that creates QR codes and renders them onto multiple graphics for a user.
The Problem:
I wrote a script to import users from a CSV. I need to create over 100 users (each going through the process above). Right now it takes roughly 1 minute per new user to complete the processing, then it spits out all my error/success messages at once.
My Question:
Rather than the browser slowly loading the result view (it currently stays on a white page until complete) while my script is processing, is there a somewhat easy way to display the live progress and errors as they happen? Something like a progress bar updated as each user is created/fails. I'm guessing it will require AJAX?
When dealing with websites, remember the golden rule.
PHP MUST DIE.
Noobs assume this is people rubbishing PHP. It isn't. It's the HTTP request cycle.
Request In > PHP > Response Out > PHP process dies.
This is only the case when dealing with web servers and browsers, not CLI PHP. But the point is that you may end up getting Apache timeouts if your script takes as long as you say.
One solution could be to set up a cron job that checks for a file and, if it finds one, processes it, writing the current line number to a progress file that your browser can poll to fetch progress:
<?php
if (file_exists('/some/csv/to/process.csv')) {
    // open the file and read which line we got to on the last run
    $handle = fopen('/some/csv/to/process.csv', 'r');
    $done = (int) @file_get_contents('/path/to/progress.txt');
    $line = 0;
    while (($row = fgetcsv($handle)) !== false) {
        if ($line++ < $done) continue; // row already processed
        // process $row (create the user, render the graphics, etc.)
        file_put_contents('/path/to/progress.txt', $line); // update progress
    }
    fclose($handle);
}
Meanwhile, you could set up a script that does this:
<?php
$progress = file_get_contents('/path/to/progress.txt');
header('Content-Type: application/json');
echo json_encode(['progress' => $progress]);
And then get the progress using AJAX inside a setInterval function:
setInterval(function () {
    $.get('/path/to/progress/json/page', function (data) {
        console.log(data);
    });
}, 2000);
Just an idea, may or may not suit you but give it a try!

Simple HTML Dom save file with Cron Job once a day, then access that saved file

I am using SimpleHTMLDom to get some info off of a RSS feed. This data is only updated once a day around 7am. I would like to use the feature $html->save('result.htm'); Then have my page just load the result.htm file instead of running the parse each time I look at the page.
I guess I am wondering, would this be a good idea? Would it really speed the page load time up that much? Would using a cache be similar or maybe better?
(this question almost address this)
Yes, it would be a good idea, and you couldn't get much faster (unless you load the page into web server memory and serve it from there).
Just extend the cron job you already have: process the data with SimpleHTMLDom at 7am and save the HTML it produces, then keep serving that file until the next morning.
Just make sure you write to a temporary file first (result.tmp.html) and only do the move/rename once the cron job finishes.
I am not sure I told you anything you didn't already know...
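The write-then-rename step could be sketched like this. The $freshHtml stand-in represents whatever SimpleHTMLDom produced; in the real cron job you would build it from the feed. rename() is atomic when source and destination are on the same filesystem, so readers of result.htm never see a half-written file.

```php
<?php
// Sketch of the cron job's final step: write to a temp file, then
// atomically rename it into place. Paths and content are placeholders.
$freshHtml = '<ul><li>parsed feed item</li></ul>'; // stand-in for parsed output

$dir = sys_get_temp_dir();
file_put_contents($dir . '/result.tmp.html', $freshHtml);
rename($dir . '/result.tmp.html', $dir . '/result.htm'); // atomic swap

echo file_get_contents($dir . '/result.htm'), "\n";
```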

PHP output messages while script in process

For example, I'm looping through a big file, and every time the counter reaches another 1000 parsed strings I need to echo a message that 1000 strings have been parsed and calculate the percentage of strings completed overall.
Is it possible to make something like that with output buffer?
Take a look at flush(). Whether or not your browser will render the incomplete page, or wait till it finishes loading is entirely implementation-dependent, though...
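A sketch of that approach: report progress every 1000 lines and call flush() so the partial output reaches the browser before the script finishes. The sample file written at the top is a stand-in for the real data, and the line count is assumed known so a percentage can be computed.

```php
<?php
// Sketch: echo a progress message every 1000 parsed lines, flushing the
// output buffer each time. Whether the browser renders it incrementally
// is implementation-dependent, as noted above.
$path = sys_get_temp_dir() . '/big_file.txt';
file_put_contents($path, str_repeat("line\n", 5000)); // stand-in data
$total = 5000; // assumed known, for the percentage

$handle = fopen($path, 'r');
$count = 0;
while (fgets($handle) !== false) {
    // ... parse the line here ...
    $count++;
    if ($count % 1000 === 0) {
        printf("Parsed %d strings (%.1f%%)\n", $count, $count / $total * 100);
        flush(); // push partial output toward the browser
    }
}
fclose($handle);
```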
Make your script write the progress data to a text file on the server. Then program your web page, with the help of Ajax, to send requests to that file at particular intervals of time. Get the data, calculate the percentage, and modify the HTML of your page.
One possibility is to use another script to output the progress, have the client poll it at set intervals for the current progress, and only ask for the complete output after the whole process is complete.

PHP database simulation

I have a PHP script that works by calling items from a database based upon the time they were placed in there and it deletes them if they are older than 5 minutes. Basically, I want to now simulate what would happen if this database was being updated regularly.
So I was considering adding some code that loads an XML file and parses it into the database based upon the time data located within a node of the XML... but the problem is that I want it to continually loop through and enter this data, so it will never actually run the other processes.
So I was thinking of having another PHP script do this, independently of the PHP script that is going to display the data...
In theory:
I am looking to have a button that I can press which will run some PHP code to load an XML file from a directory on my web server and then iterate through the data, sending it to a database based upon the time within a node of the XML and the time the script was first called.
So back to my page that displays the data... if I continually hit refresh, it will contain different results each time, because data is being added by the other process and this PHP script removes the older data when it is refreshed.
Any information on this?
Is there a way I can silently, and safely, run a PHP script without it being loaded into a browser... like a thread?
Why not just run the PHP script that parses and inserts data into your DB from PHP's CLI?
http://www.php.net/manual/en/features.commandline.usage.php
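A sketch of what that separate feeder script might look like, run from the CLI (e.g. `php feeder.php` in a cron entry, or with a trailing `&` to background it) rather than through a browser. The XML shape and the table/column names are assumptions, and the database write is stubbed out as an echo of the statement it would execute.

```php
<?php
// feeder.php: parses time-stamped rows out of an XML document and would
// insert them into the database, independently of the display page.
$xml = new SimpleXMLElement(
    '<rows><row time="2024-01-01 07:00:00">first</row>' .
    '<row time="2024-01-01 07:05:00">second</row></rows>'
);

foreach ($xml->row as $row) {
    // real script: a prepared statement execute() against your database
    echo "INSERT INTO items (created_at, value) VALUES ('"
        . $row['time'] . "', '" . $row . "')\n";
}
```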
