Caching XML data from remote server - PHP

Let's say I'm using SimpleXML to parse weather data from a remote server. If the remote server crashes so that I can no longer retrieve its live feeds, but I don't want my users to get an error message either, how would I go about caching and continuing to display the last piece of data I got from the server before it crashed?
Let's say my XML looks like this:
<weather>
<temperature c="25" f="77" />
</weather>
How would I go about displaying the values "25" and "77" until I'm able to reestablish a connection with the remote server?
Apologies if my question isn't entirely clear... my knowledge of server-side technologies is very limited.

First: You do not want to fetch the remote data live when the user requests your site. That works for small sites with few visitors when no problems occur, but as soon as the remote server hangs, your site will also hang until the connection timeout occurs.
What we mostly do is the following:
Create a script that fetches the remote file and stores it locally in a temporary folder. If the remote file cannot be fetched, do not overwrite the old one. This is very important, and @Drazisil's code does exactly this.
Call that script with a cron job, or at the end of your normal script every x minutes
Use the local file when creating your normal HTML output instead of the remote one.
In the end, your pages will always be delivered fast and will not crash when the remote server is down.
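For illustration, here is a minimal sketch of such a fetch script; the feed URL and cache path are assumptions:
<?php
// fetch_weather.php - run from cron, e.g.: */5 * * * * php /path/to/fetch_weather.php
$remote = 'http://example.com/weather.xml'; // assumed feed URL
$cache  = '/tmp/weather_cache.xml';         // assumed local cache path

$data = @file_get_contents($remote); // suppress warnings; we check the result ourselves

// only overwrite the cache when the fetch succeeded and the payload parses as XML
if ($data !== false && @simplexml_load_string($data) !== false) {
    file_put_contents($cache, $data);
}
// on failure, do nothing: the previous cache file stays in place
Your page then reads the local cache file instead of the remote URL, so a remote outage never surfaces to visitors.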

This isn't the best way, but here is one way you could do it:
To save the information
$file = 'temp_cache.php';
// Build the PHP snippet that stores the current values
$content = '<?php $c="25"; $f="77"; ?>';
// Write the contents to the file
file_put_contents($file, $content);
To load it
include_once('temp_cache.php');
By including the file, your $c and $f variables will be set unless you overwrite them.

Store the values locally and display that information to your users. Update whenever you want, in such a way that it only overwrites the local copy when the update is successful; if it fails, you will still have your local copy.
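Tying this back to the original XML, here is a minimal sketch of the display side, assuming a cache file written by a fetch script like the one above:
<?php
// read the last successfully cached copy; never touches the remote server
$xml = simplexml_load_file('/tmp/weather_cache.xml'); // assumed cache path
if ($xml !== false) {
    // attribute access on the <temperature> element
    echo 'Temperature: ' . $xml->temperature['c'] . ' C / ' . $xml->temperature['f'] . ' F';
}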

Related

(only server-side) How to get the echoed HTML div result of PHP code saved to a PNG file on this server?

Just as a log file is written by a PHP script via fwrite($fp, ---HTML---), I need to save an HTML div as a PNG file on the server.
The client browser only starts the PHP script;
the PNG file should then be saved on the server without any client interaction.
Is there a way to do this?
All the posts (over a thousand) I have been reading are about html2canvas,
which (as I understand it) operates client-side.
I know the browser normally does the HTML (HTML div) rendering. [= client-side]
But is there a way to do it in PHP on the server side?
Reason:
Until now the procedure is to
print the div via the browser on paper twice:
one for the customer,
one to scan in again to save on the server as a picture, then throw it in the wastepaper basket.
More than 500 times a day...
For security reasons it needs to be a saved picture on the server.
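No answer is attached to that thread here, but for illustration, one common server-side route (an assumption on my part, not from the thread) is to shell out to a headless HTML-to-image renderer such as wkhtmltoimage:
<?php
// hypothetical sketch: render a div to PNG entirely on the server
$html = '<html><body><div>... receipt markup here ...</div></body></html>';
$tmp  = '/tmp/receipt.html'; // assumed paths
file_put_contents($tmp, $html);
// requires the wkhtmltoimage binary to be installed on the server
shell_exec('wkhtmltoimage ' . escapeshellarg($tmp) . ' /var/www/storage/receipt.png');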

Least disruptive way to download a file using PHP, without disrupting ongoing updates to it

I am working on a website in which it would be useful to allow a user the option of downloading the contents of a file, even when it's going to be updated by another user at the same time or later.
My problem is that the solution I've tried so far allows downloading, but disrupts any later updating of the file. I don't think I can present the updating code concisely (it is spread over multiple files), except to say that it works by AJAXing the data (and I'm not sure why that would cause this problem). In case it's relevant, this is a file that gets updated multiple times.
When I use FireFTP I can download the file without disrupting this process, which makes me optimistic there's a PHP solution. I am currently downloading the data by AJAXing the file contents to the page the "downloading user" is on. The code for this (within PHP) is:
$file_contents = file_get_contents($_POST['file']); // file address comes through AJAX POST request
echo($file_contents); // to access the content client side
Is there another way to access the text/content within a file without any unintended consequences on other server processing of it?
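One hedged possibility: if the code that updates the file takes an exclusive flock() while writing, the download side can take a shared lock, so a read never interleaves with a half-finished write. A sketch (this only helps if the writer cooperates with LOCK_EX):
<?php
$path = $_POST['file']; // NB: validate/whitelist this path in real use
$fp = fopen($path, 'r');
if ($fp !== false && flock($fp, LOCK_SH)) { // shared lock: many readers, no writers
    $file_contents = stream_get_contents($fp);
    flock($fp, LOCK_UN); // release promptly so pending updates can proceed
    fclose($fp);
    echo $file_contents; // to access the content client side
}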

Echo script progress and download CSV

I'm having problems sending an array to another PHP page. We send an array from one page to another to generate a CSV file that has been transformed from XML. So we take an 800 MB XML file and transform it down to a 20 MB CSV file. There is a lot of information in it that we are removing, and the run takes 30 minutes.
Anyway, we are periodically using a function to output the progress of the transformation in the browser with messages:
function outputResults($message) {
    ob_start();
    echo $message . "<br>";
    ob_end_flush();
    ob_flush();
}
$masterArray contains all the information in an associative array we have parsed from the XML.
At the end we send the array ($masterArray) from index.php to another PHP file called create_CSV_file.php.
Originally we used include('create_CSV_file.php') within index.php, but due to the headers used in the CSV file it was giving us the message:
Warning: Cannot modify header information - headers already sent
So we started looking at a solution of pushing the array as below.
echo "<a href='create_CSV_file.php?data=$masterArray'>**** Download CSV file ***</a>";
I keep getting the following error message with the above echo:
Notice: Array to string conversion
What is the best method to be able to show echo statements from the server as it is running, then be able to download the result CSV at the end?
OK, so first of all, using data in a URL (GET) has some severe limitations. Older versions of IE only supported 4096-byte URLs. In addition, some proxies and other software impose their own limits.
I'm sure you've heard this before, but if not: you should not be running a process that takes more than a couple of seconds (at most!) from a web server. Web servers are not optimised for it. You definitely don't want to be passing megabytes of data to the client just so they can send it back to the server!
How about something like this...
1. User makes a web request (and uploads the original data?) to the server.
2. Server allocates an ID for the request (random? database?) and creates a file on disk using the ID as a name (tmp directory, or at least outside the web root).
3. Server launches a new process (PHP?) to transform the data. As it runs, it can update the database with progress information.
4. During this time, the user can check progress by making a sequence of AJAX requests (or just refreshing a page which shows the latest status). Lots more control over appearance now.
5. When the processing is complete, the server-side process writes the results to a file and updates the database to indicate completion.
6. Next time the user checks the status, redirect them to a PHP file that takes the ID and reads the file from disk / streams it to the user.
Benefits:
No long-running http requests
No data being passed back/forth to client in intermediate stage
Much more control over how users see progress
Depending on the transformation you're applying / the detail stored in the database, you may be able to recover interrupted jobs (server failure)
It does have one downside: you need to clean up after yourself. The files you created on disk need to be deleted; however, you've got a complete audit of all files in the database, and deleting anything over x days old would be trivial.
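A rough sketch of the status-check step in that flow; the jobs table, its columns, and download.php are all assumptions:
<?php
// status.php?id=... - polled by the client via AJAX
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // assumed DSN
$stmt = $pdo->prepare('SELECT status, progress FROM jobs WHERE id = ?');
$stmt->execute([$_GET['id']]);
$job = $stmt->fetch(PDO::FETCH_ASSOC);

if ($job && $job['status'] === 'done') {
    // processing finished: hand off to a script that streams the CSV from disk
    header('Location: download.php?id=' . urlencode($_GET['id']));
} else {
    echo 'Progress: ' . ($job ? $job['progress'] : 0) . '%';
}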

MySQL Server has gone away PHP Upload/file_put_contents

Hey everyone, this issue is giving me gray hair. Last week this was all working fine; Monday when I got into work and tried running another test, it started failing.
Background:
I am building a utility for work that will allow the video guys to either upload a file from their system or specify the URL of a video, whether it's in Amazon S3 or wherever. The utility will then either upload the video, store it, and put an entry in the DB to keep track of it, or move it from the URL provided to the directory in which files are stored, and do the same with the MySQL record.
The Issue:
It has recently become an issue where, when moving from a URL, I get an error stating that the "MySQL server has gone away". Lots of searching has turned up lots of issues about timeouts, packet size, etc. But the MySQL connection isn't even opened until after the file has been moved into its directory. When moving a file from a URL, I get the error in under 10 seconds. The only tests I can get to go through are when I specify the URL of a file that is really small, like 5 MB.
Last week I was able to run successful tests on files over 500 MB (pulling from a URL). Now the files still move over, but I get the "server has gone away" error.
Here is the code I am using
file_put_contents('vid_bin/'.$fName, fopen($url, 'r'));

$qry = "INSERT INTO videos (".$fields.") VALUES (".$vals.")";

$db = new mydb;
$db->mydb;                    // connects
$db->select_db("thedbname");  // selects db
$db->query($qry);             // runs query

if($db->error) {
    die($db->error . "\n" . $qry); // oh noes
} else {
    ...
}
Our server guy hasn't been available, so I am left wondering if it's my code or something that changed on the server. I am pretty sure it's not the code, but I wanted outside opinions on what could be causing MySQL to freak out, given that only strings, dates, and ints are stored in it.
UPDATE:
If I process the MySQL bits before messing with the file, all works well. The problem is that I have the file-handling bit of code in a try/catch, because if someone provides an invalid URL or something goes haywire, I don't want to have to go back and remove the MySQL record.
Any idea why MySQL would care whether the file is handled before it? I was thinking it might have something to do with the packet size in the MySQL config, but I am not storing the file in the DB. It looks like it is going to be an issue for the server admin to handle, unless someone has insight into why this would be happening from the code end of things.
It turns out nothing was changed on the server, so I'm not sure why the code stopped working as intended. I now just handle the MySQL stuff first, then the file. There is a cron job that then does some work on the files based on the SQL results, so I added a check that the file exists, and if it doesn't, I remove the record. Not exactly how I wanted it to operate, but it gets the job done.
Increase max_allowed_packet in MySQL to 128 MB.
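For reference, that setting lives in the MySQL server config (the 128 MB figure just mirrors the suggestion above; tune it to your needs):
# my.cnf / my.ini - restart the MySQL server after editing
[mysqld]
max_allowed_packet = 128M
The same change can be made at runtime with SET GLOBAL max_allowed_packet = 134217728; (128 MB in bytes), though it resets on the next server restart.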

How do I make foreach read a file over FTP with auto-login?

I have a script that allows me to port my in-game chat from Call of Duty 4 to a small window, using a PHP script that runs live from my gaming server, so my members and visitors can see in-game chat live on my site.
The only problem is that my server is locked behind FTP (TCAdmin, so there is no way to get a direct URL to the file), so I want a way to read the games_mp.log file in that directory.
I think the only way is to connect through FTP, I guess, but it'll need to auto-login.
The script is (this is only the part of the script where I'm having the problem):
<?php
$maxMessages = 40;
$messageArr = array();
foreach (file('file:///home/serverexample/main/games_mp.log') as $value) {
    if (strstr($value, 'say;')) {
        $messageArr[] = $value;
    }
}
What I need help with is making the foreach go to the FTP server for my site, into the specific path where it can read games_mp.log.
I was thinking something similar to: ftp://user:pass@ftp.domain.com/serverexample/main/games_mp.log
Can you tell me how to modify the script so it downloads the file into a local directory? It would need to be constantly re-downloading, though, which I'm afraid might cause lag.
The other way I was thinking of, if possible, is to read the games_mp.log file directly.
games_mp.log is readable as text and is always being updated by the server. Is there any way to make the foreach(file(...)) read over FTP, so it takes the info directly from the FTP server?
Preferably the script would read the file without having to download it,
and if there is no way other than downloading it, can you guide me through that as well?
Many thanks; I hope I made clear what I need.
You can log in and download the file with the FTP functions of PHP; see the manual: http://www.php.net/ftp
An alternative (not recommended) is wget: http://www.cyberciti.biz/faq/wget-command-with-username-password/
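A minimal sketch with those FTP functions, reusing the host, credentials, and paths from the question as placeholders:
<?php
// download games_mp.log over FTP, then filter it exactly as before
$messageArr = array();
$conn = ftp_connect('ftp.domain.com');
if ($conn !== false && ftp_login($conn, 'user', 'pass')) {
    ftp_pasv($conn, true); // passive mode is friendlier to firewalls
    if (ftp_get($conn, '/tmp/games_mp.log', '/serverexample/main/games_mp.log', FTP_ASCII)) {
        foreach (file('/tmp/games_mp.log') as $value) {
            if (strstr($value, 'say;')) {
                $messageArr[] = $value;
            }
        }
    }
    ftp_close($conn);
}
Alternatively, if allow_url_fopen is enabled, PHP's built-in ftp:// stream wrapper lets the existing foreach work unchanged: file('ftp://user:pass@ftp.domain.com/serverexample/main/games_mp.log').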
