My project requires me to read a CSV file and display it in a browser automatically. Before I post the code, I want to confirm I have the logic correct and am not confusing myself with more development than necessary. From my research, there are two ways this can be done at a basic level.
Cross domain: A program (R) on server 1 outputs a CSV file at a set time interval. I then need a server-side language (PHP) on server 1 to parse the data into an array. On server 2, I use a PHP proxy or JSONP for a cross-domain GET, calling it via AJAX to load the data into the client-side script on server 2.
Same domain: A program (R) on the server outputs a CSV file at a set time interval. I would still need a PHP script to parse the data into an array, and then an AJAX call to load the data into the client-side JS.
Am I right that I cannot use the jquery-csv plugin and the HTML5 FileReader to do this automatically in either case, because those are for a user manually uploading a file?
Also, to have a two-way connection where data is both pushed and pulled, I would need to implement WebSockets or long polling/HTTP streaming.
Please confirm my logic above.
Thanks.
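On the last point, the long-polling option can be as simple as a PHP endpoint that blocks until the CSV file changes and then returns it; the client re-issues its AJAX request each time a response (or timeout) comes back. A minimal sketch, where the file name, parameter name, and timeout are my assumptions, not from the question:

<?php
// poll.php: long-polling endpoint; blocks until data.csv changes, then returns it
$file     = 'data.csv';                  // assumed path of the CSV written by R
$lastSeen = (int) ($_GET['since'] ?? 0); // client sends the mtime it last saw
$deadline = time() + 25;                 // respond before typical 30s HTTP timeouts

while (time() < $deadline) {
    clearstatcache();
    $mtime = @filemtime($file);
    if ($mtime !== false && $mtime > $lastSeen) {
        header('Content-Type: application/json');
        echo json_encode(['mtime' => $mtime, 'csv' => file_get_contents($file)]);
        exit;
    }
    sleep(1);                            // nothing new yet; wait and re-check
}
http_response_code(304);                 // no change in this window; client polls again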
Do you need to parse the CSV on server 1 and make the parsed data available to server 2 (or have server 2 download it from server 1)? If so, you just need fgetcsv on server 1 and a simple curl/file_get_contents call on server 2.
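To make that concrete, here is a minimal sketch of both halves; the file names and URL are assumptions, not from the question:

<?php
// csv_json.php on server 1: parse the CSV with fgetcsv and expose it as JSON
$rows = [];
if (($fp = fopen('data.csv', 'r')) !== false) {
    while (($line = fgetcsv($fp)) !== false) {
        $rows[] = $line;                  // each $line is an array of fields
    }
    fclose($fp);
}
header('Content-Type: application/json');
echo json_encode($rows);

<?php
// fetch.php on server 2: pull the parsed data from server 1
$json = file_get_contents('http://server1.example.com/csv_json.php');
$rows = json_decode($json, true);         // back to a PHP array

In the same-domain case you can skip fetch.php entirely and let the browser's AJAX call hit csv_json.php directly.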
Much like a log file is written by a PHP script via fwrite($fp, ---HTML---),
I need to save an HTML DIV as a PNG file on the server.
The client browser only starts the PHP script;
the PNG file should then be saved on the server without any client interaction.
Is there a way to do this?
All the posts (thousands of them) I have been reading are about html2canvas,
which (as I understand it) operates client-side.
I know that rendering HTML (the DIV) is normally done by the browser, i.e. client-side.
But is there a way to do it in PHP, on the server side?
Reason:
Until now the procedure has been to print the DIV via the browser on paper twice:
one copy for the customer,
one to scan back in, save on the server as a picture, and throw in the wastepaper basket.
This happens more than 500 times a day...
For security reasons it needs to be a picture saved on the server.
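The thread has no answer here, but one common server-side route (an assumption on my part, not from the post) is to render the HTML with a headless tool such as wkhtmltoimage and invoke it from PHP; the variable and paths below are placeholders:

<?php
// render.php: save an HTML fragment as a PNG with no client interaction.
// Requires the wkhtmltoimage binary to be installed on the server.
$html = '<div>' . $divContent . '</div>';             // $divContent assumed to hold the DIV markup
$tmp  = sys_get_temp_dir() . '/div_' . uniqid() . '.html';
file_put_contents($tmp, $html);

$out = '/var/www/scans/receipt_' . date('YmdHis') . '.png'; // assumed target path
shell_exec('wkhtmltoimage ' . escapeshellarg($tmp) . ' ' . escapeshellarg($out));
unlink($tmp);                                         // remove the temporary HTML file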
I need code in PHP for an implementation similar to the Python one linked below.
A third-party server sends data as SSE (Server-Sent Events), which I need to capture and save to my database in PHP.
I tried reading the stream with HTML5, pushing the data via an AJAX call to a PHP file and on to the DB. The drawback is that the browser has to stay open the whole time.
I also tried fopen on the streaming URL and fread to fetch the data, but looping and waiting for event data is where I got stuck. I need help with PHP code to read the data, so that I can set up the PHP script as a cron job to push data to the DB as it arrives.
Here is the Python code; I need similar code in PHP:
Reading SSE data in python
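One way to do this from a PHP CLI script (so it can run under cron or a supervisor without a browser) is curl with a write callback that fires as each chunk of the stream arrives. A sketch, with the endpoint URL and the database step as placeholders:

<?php
// sse_reader.php: consume a Server-Sent Events stream and handle each event
$url    = 'https://thirdparty.example.com/events';    // assumed SSE endpoint
$buffer = '';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_HTTPHEADER, ['Accept: text/event-stream']);
curl_setopt($ch, CURLOPT_TIMEOUT, 0);                 // 0 = no timeout; keep the stream open
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) use (&$buffer) {
    $buffer .= $chunk;
    while (($pos = strpos($buffer, "\n\n")) !== false) {  // events end with a blank line
        $event  = substr($buffer, 0, $pos);
        $buffer = substr($buffer, $pos + 2);
        foreach (explode("\n", $event) as $line) {
            if (strpos($line, 'data:') === 0) {
                $payload = trim(substr($line, 5));
                // insert $payload into the database here
            }
        }
    }
    return strlen($chunk);                            // tell curl the chunk was consumed
});
curl_exec($ch);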
I'm having problems sending an array to another PHP page. We send an array from one page to another to generate a CSV file that has been transformed from XML. So we take an 800 MB XML file and transform it down to a 20 MB CSV file. There is a lot of information in it that we are removing, and the transformation runs for 30 minutes.
Anyway, we periodically use a function to output the progress of the transformation in the browser with messages:
function outputResults($message) {
    echo $message . "<br>";
    // Flush PHP's output buffer (if one is active), then push the output to the browser
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();
}
$masterArray contains all the information, in an associative array, that we have parsed from the XML.
At the end we send the array ($masterArray) from index.php to another PHP file called create_CSV_file.php.
Originally we used include('create_CSV_file.php') within index.php, but because of the headers used for the CSV download it was giving us the message
Warning: Cannot modify header information - headers already sent
so we started looking at pushing the array as shown below.
echo "<a href='create_CSV_file.php?data=$masterArray'>**** Download CSV file ***</a>";
I keep getting this error message with the above echo:
Notice: Array to string conversion
What is the best method to show echo statements from the server while it is running, and then let the user download the resulting CSV at the end?
OK, so first of all, putting data in a URL (GET) has some severe limitations. Older versions of IE only supported URLs up to 2,083 characters. In addition, some proxies and other software impose their own limits.
I'm sure you've heard this before, but if not.... You should not be running a process that takes more than a couple of seconds (at most!) from a web server. They're not optimised for it. You definitely don't want to be passing megabytes of data to the client just so they can send it back to the server!
How about something like this...
User makes a web request (And uploads original data?) to the server
Server allocates an ID for the request (random? database?) and creates a file on disk using the ID as a name (tmp directory, or at least outside the web root)
Server launches a new process (PHP?) to transform the data. As it runs, it can update the database with progress information
During this time, the user can check progress by making a sequence of AJAX requests (or just refreshing a page which shows latest status). Lots more control over appearance now
When the processing is complete, server-side process writes results to file, updates database to indicate completion.
Next time user checks status, redirect them to a PHP file that takes the ID and will read the file from disk / stream it to the user.
Benefits:
No long-running http requests
No data being passed back/forth to client in intermediate stage
Much more control over how users see progress
Depending on the transformation you're applying / the detail stored in the database, you may be able to recover interrupted jobs (server failure)
It does have one downside: you need to clean up after yourself, because the files you created on disk need to be deleted. However, you've got a complete audit of all files in the database, so deleting anything over x days old would be trivial.
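A minimal sketch of the status and download endpoints from that flow; the table, column, and path names are assumptions:

<?php
// status.php: polled by the client's AJAX calls to report job progress
$pdo  = new PDO('mysql:host=localhost;dbname=jobs', 'user', 'pass');
$stmt = $pdo->prepare('SELECT status, progress FROM jobs WHERE id = ?');
$stmt->execute([$_GET['id']]);
header('Content-Type: application/json');
echo json_encode($stmt->fetch(PDO::FETCH_ASSOC));

<?php
// download.php: once the job is done, stream the result file to the user
$id   = basename($_GET['id']);            // basename() blocks path traversal
$path = '/var/data/jobs/' . $id . '.csv'; // assumed result location (outside the web root)
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="result.csv"');
readfile($path);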
I'm about to implement a REST server (in ASP.NET, although I think that's irrelevant here), where a request is made and it returns the result. However, this result is an .XLSX file that could be a million rows.
If I'm generating a million-row spreadsheet, it's going to take about 10 minutes, and an HTTP request will time out. So what's the best way to handle this delay in the result?
Second, what's the best way to return a very large file as the REST result?
Update: The most common use case is that the REST server is an Azure cloud service web worker (basically IIS on Azure). The client is a PHP web app running on a different server in a different location. The PHP web app needs to send up a report template (generally 25K) and the data, which can be a connection string to a SQL database, or... a 500 MB XML file. So the request is an XML file containing the template and datasource(s).
The response is a file: PDF, DOCX, XLSX, PPTX, or HTML. That can be a BLOB inside an XML file, or it can be the file itself. In the case of an error, it must return XML with the error information. The big issue is that it can take 10 minutes to generate this file even when everything goes right: with a 1-million-row spreadsheet, it takes time to pull down all that data and populate the XLSX file. The second issue is that the result is then a really large file.
So even if everything is perfect, there's a big delay and a large response.
I see two options:
Write the file to the response stream during its generation (from the client side this looks like downloading a large file);
Start a file-generation task on the server side and return a task id immediately. Add API methods that allow the client to retrieve the task status, cancel it, or get the results (once the task is complete).
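From the PHP client's side, the second option might look like the following sketch; the endpoint paths and JSON fields are assumptions:

<?php
// Submit the report job, then poll until it is ready
$templateXml = file_get_contents('template.xml');   // assumed 25K report template
$ctx  = stream_context_create(['http' => ['method' => 'POST', 'content' => $templateXml]]);
$task = json_decode(file_get_contents('https://api.example.com/reports', false, $ctx), true);

do {
    sleep(10);                                      // poll instead of holding a connection open
    $status = json_decode(
        file_get_contents('https://api.example.com/reports/' . $task['id'] . '/status'),
        true
    );
} while ($status['state'] === 'running');

if ($status['state'] === 'done') {
    // copy() streams the large result straight to disk instead of into memory
    copy('https://api.example.com/reports/' . $task['id'] . '/result', 'report.xlsx');
}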
Interesting question. I sure hope you have a stable connection. Anyway, on the client side (in this case PHP), set the timeouts to very high values. In PHP:
set_time_limit(3600 * 10);                       // allow the PHP script to run for up to 10 hours
curl_setopt($curlh, CURLOPT_TIMEOUT, 3600 * 10); // allow the curl transfer to take just as long
Let's say I'm using SimpleXML to parse weather data from a remote server, and then the remote server crashes, so I can no longer retrieve its live feeds. I don't want my users to get an error message either. How would I go about caching, and continuing to display, the last piece of data I got from the server before it crashed?
Let's say my XML looks like this:
<weather>
<temperature c="25" f="77" />
</weather>
How would I go about displaying the values "25" and "77" until I'm able to reestablish a connection with the remote server?
Apologies if my question isn't entirely clear... my knowledge of server-side technologies is very limited.
First: You do not want to fetch the remote data live when a user requests your site. That works for small sites with few visitors as long as no problems occur, but as soon as the remote server hangs, your site will hang too, until the connection timeout occurs.
What we mostly do is the following:
Create a script that fetches the remote file and stores it locally in a temporary folder. If the remote file cannot be fetched, do not overwrite the old one. This is very important, and @Drazisil's code does exactly this.
Call that script from a cron job, or at the end of your normal script, every x minutes.
Use the local file instead of the remote one when creating your normal HTML output.
In the end, your pages will always be delivered fast and will not crash when the remote server is down.
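A minimal fetch script along those lines, with the feed URL and cache path as placeholders:

<?php
// fetch_weather.php: run from cron; only overwrites the cache on success
$remote = 'http://weather.example.com/feed.xml';  // assumed feed URL
$cache  = __DIR__ . '/cache/weather.xml';

$xml = @file_get_contents($remote);
// Keep the old cache unless the fetch succeeded AND the payload parses as XML
if ($xml !== false && @simplexml_load_string($xml) !== false) {
    file_put_contents($cache, $xml, LOCK_EX);
}

The page itself then reads the temperature attributes from the local copy with simplexml_load_file($cache), so a remote outage never delays rendering.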
This isn't the best way, but here is one way you could do it:
To save the information:
$file = 'temp_cache.php';
// Build a PHP snippet that stores the cached values
$content = '<?php $c="25"; $f="77"; ?>';
// Write the snippet to the cache file
file_put_contents($file, $content);
To load it:
include_once('temp_cache.php');
By including the file, your $c and $f variables will be set, unless you overwrite them.
Store the values locally and display that information to your users. Update whenever you want, in such a way that the local copy is only overwritten when the fetch succeeds; if it fails, you will still have your local copy.