How to read SSE data in PHP

I need PHP code for an implementation similar to the Python one linked below.
A third-party server sends data as SSE (Server-Sent Events), which I need to capture and save to my database in PHP.
I tried reading the stream with HTML5 and pushing the data via an AJAX call to a PHP file, which in turn writes it to the DB. But the drawback is that a browser has to stay open at all times.
I also tried fopen() on the streaming URL and fread() to fetch the data, but the looping and waiting for event data is where I'm stuck. I need help with PHP code that reads the data, so I can set up a cron job for the PHP script to push it to the DB as the data arrives.
Here is the Python code; I need similar code in PHP:
Reading SSE data in python
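Since PHP has no built-in SSE client, one common approach is a long-running CLI script that uses cURL with a write callback and parses complete events out of a buffer as chunks arrive. A minimal sketch, assuming a plain `text/event-stream` feed (the URL and the database step are placeholders):

```php
<?php
// Sketch: consume an SSE stream from a CLI script with cURL.
// The feed URL and what you do with each event are placeholders.

// Parse one raw SSE event block ("field: value" lines) into an array.
function parseSseEvent(string $raw): array {
    $event = [];
    foreach (explode("\n", $raw) as $line) {
        $line = rtrim($line, "\r");
        if ($line === '' || $line[0] === ':') continue; // blank or comment line
        if (strpos($line, ':') === false) continue;
        list($field, $value) = explode(':', $line, 2);
        $event[$field][] = ltrim($value, ' ');
    }
    // Multiple "data:" lines are joined with newlines, per the SSE format.
    if (isset($event['data'])) $event['data'] = implode("\n", $event['data']);
    return $event;
}

// Keep the connection open and invoke $onEvent for every complete event.
function streamSse(string $url, callable $onEvent): void {
    $buffer = '';
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_HTTPHEADER    => ['Accept: text/event-stream'],
        CURLOPT_TIMEOUT       => 0, // 0 = never time out; we want to stay connected
        CURLOPT_WRITEFUNCTION => function ($ch, $chunk) use (&$buffer, $onEvent) {
            $buffer .= $chunk;
            while (($pos = strpos($buffer, "\n\n")) !== false) { // blank line ends an event
                $onEvent(parseSseEvent(substr($buffer, 0, $pos)));
                $buffer = substr($buffer, $pos + 2);
            }
            return strlen($chunk); // tell cURL the chunk was consumed
        },
    ]);
    curl_exec($ch);
    curl_close($ch);
}

// streamSse('https://example.com/stream', function (array $event) {
//     // insert $event['data'] into the database here
// });
```

Note that a per-minute cron job is the wrong shape for this: the script holds the connection open indefinitely, so it is better started once (e.g. via supervisord or an @reboot cron entry) and left running.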

Related

#only server-side# How to get the echoed HTML div result of PHP code saved as a PNG file on the server?

Just as a log file is written by a PHP script via fwrite($fp, ---HTML---),
I need to save an HTML div as a PNG file on the server.
The client browser only starts the PHP script;
the PNG file should be saved on the server without any client interaction.
Is there a way to do this?
All the posts (over a thousand) I have read are about html2canvas,
which (as I understand it) operates client-side.
I know that rendering HTML (including divs) is normally done by the browser, i.e. client-side.
But is there a way to do it in PHP on the server side?
Reason:
Until now the procedure has been to
print the div via the browser on paper twice:
one copy for the customer,
one to scan back in, save on the server as a picture, and throw in the wastepaper basket.
This happens more than 500 times a day...
For security reasons it needs to be a saved picture on the server.
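One server-side option is to shell out to a headless HTML-to-image renderer such as the wkhtmltoimage CLI, which has to be installed on the server separately. A rough sketch, where the helper names and all paths are illustrative, not from the question:

```php
<?php
// Sketch: render an HTML div to a PNG on the server by shelling out to the
// wkhtmltoimage command-line tool (installed separately). Paths are examples.

// Wrap a bare <div> in a minimal document so the renderer gets a full page.
function wrapDiv(string $divHtml): string {
    return '<!DOCTYPE html><html><head><meta charset="utf-8"></head><body>'
         . $divHtml . '</body></html>';
}

function divToPng(string $divHtml, string $outFile): bool {
    $tmp = tempnam(sys_get_temp_dir(), 'div') . '.html';
    file_put_contents($tmp, wrapDiv($divHtml));
    // --quiet suppresses progress output; escapeshellarg guards the paths.
    exec('wkhtmltoimage --quiet ' . escapeshellarg($tmp) . ' '
        . escapeshellarg($outFile), $output, $code);
    unlink($tmp);
    return $code === 0 && is_file($outFile);
}

// divToPng('<div class="receipt">...</div>', '/var/www/scans/r1234.png');
```

Since the rendering happens entirely in the PHP process on the server, the client browser only has to trigger the script, which matches the "no client interaction" requirement.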

If a REST request can take 10 minutes

I'm about to implement a REST server (in ASP.NET, although I think that's irrelevant here) where a request is made and it returns the result. However, this result is an .XLSX file that could be a million rows.
If I'm generating a million-row spreadsheet, it's going to take about 10 minutes, and an HTTP request will time out. So what's the best way to handle this delay in the result?
Second, what's the best way to return a very large file as the REST result?
Update: The most common use case is that the REST server is an Azure cloud service web worker (basically IIS on Azure). The client is a PHP web app running on a different server in a different location. The PHP web app needs to send up a report template (generally 25K) and the data, which can be a connection string to a SQL database or a 500M XML file. So the request is an XML file containing the template and datasource(s).
The response is a file - PDF, DOCX, XLSX, PPTX, or HTML. That can be a BLOB inside an XML file or the file itself. In the case of an error it must return XML with the error information. The big issue is that it can take 10 minutes to generate this file even when everything goes right: for a million-row spreadsheet, it takes time to pull down all that data and populate the generated XLSX file. The second issue is that the result is then a really large file.
So even if everything is perfect, there's a big delay and a large response.
I see two options:
Write the file to the response stream during its generation (from the client side this looks like downloading a large file);
Start a file-generation task on the server side and return a task ID immediately. Add API methods that let the client retrieve the task's status, cancel it, or fetch the results (once the task completes).
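From the PHP client's side, the second option could look roughly like the sketch below. The endpoint layout and the JSON status shape ({"status": ..., "resultUrl": ...}) are assumptions for illustration, not part of any real API:

```php
<?php
// Sketch of option 2 from the PHP client's side: start the job, poll until
// it finishes, then download the result. Endpoints and the JSON status
// shape ({"status": ..., "resultUrl": ...}) are assumptions.

// Decide the next step from one status response.
function pollAction(string $json): string {
    $s = json_decode($json, true);
    $status = $s['status'] ?? '';
    if ($status === 'done')  return 'download';
    if ($status === 'error') return 'fail';
    return 'wait'; // still queued or running
}

// $taskId = ...; // returned immediately by the "start report" request
// do {
//     sleep(15); // poll gently; generation may take ~10 minutes
//     $json = file_get_contents("https://api.example.com/reports/$taskId");
// } while (pollAction($json) === 'wait');
// if (pollAction($json) === 'download') {
//     // copy() streams the large result to disk instead of buffering it all.
//     copy(json_decode($json, true)['resultUrl'], '/tmp/report.xlsx');
// }
```

The advantage of this shape is that each HTTP request stays short, so no single request has to survive the 10-minute generation window.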
Interesting question.
I sure hope you have a stable connection. In any case, on the client side - PHP here - set the timeouts to very high values:
set_time_limit(3600 * 10);
curl_setopt($curlh, CURLOPT_TIMEOUT, 3600 * 10);

Code Logic to display .csv file on a browser

My project requires reading a CSV file and displaying it in a browser automatically. Before I post the code, I want to confirm I have the logic correct and am not making this more complicated than necessary. From my research, there are two ways this can be done at a basic level.
Cross-domain: a program (R) on server 1 outputs a CSV file at some set time interval. I then need a server-side language (PHP) on server 1 to parse the data and put it into an array. I then use a PHP proxy or JSONP format on server 2 for a cross-domain GET, calling it via AJAX and loading it into the client-side script on server 2.
Same domain: a program (R) on the server outputs a CSV file at some set time interval. I would still need a PHP script to parse the data into an array, which I then load into the client-side JS with an AJAX call.
I cannot use the jquery-csv plugin with the HTML5 FileReader to do this automatically in either case, because that is for a client user manually uploading a file, correct?
Also, to have a two-way connection where data is both pushed and pulled, I need to implement WebSockets or long polling/HTTP streaming.
Please confirm my logic above.
Thanks.
You need to parse the CSV on server 1 and send the parsed data to server 2 (or download it from server 1 to server 2)? If so, you just need fgetcsv() on server 1 and a simple curl/file_get_contents() call on server 2.
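The PHP parsing step is the same in either setup; a minimal sketch with fgetcsv() that keys each row by the header row (the filename and the header-row assumption are examples):

```php
<?php
// Sketch: parse a CSV file with fgetcsv() into an array of rows keyed by
// the header line, ready to be JSON-encoded for an AJAX caller.

function csvToRows(string $file): array {
    $rows = [];
    if (!is_readable($file)) return $rows;
    $fh = fopen($file, 'r');
    $header = fgetcsv($fh);              // first line holds the column names
    while (($line = fgetcsv($fh)) !== false) {
        $rows[] = array_combine($header, $line);
    }
    fclose($fh);
    return $rows;
}

// The endpoint fetched by the AJAX call would then just emit JSON:
// header('Content-Type: application/json');
// echo json_encode(csvToRows('/var/data/report.csv'));
```

On the client, the AJAX callback receives an already-structured array of objects, so no CSV handling is needed in JavaScript at all.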

PHP make real time stock exchange application

I have software that gives me stock data in Excel format; the data updates automatically, continuously, every second. I have to show this data on a web page just as it is shown in Excel (i.e. the web data should update in the same manner). How can this be done?
Programmatically export the data into CSV format and import it into a relational database.
Extract the data with a web language and display it in the web page. Tutorials for all of these steps should be available.
To convert from xls to csv see the question...
converting an Excel (xls) file to a comma separated (csv) file without the GUI
For the second part, you can have a cron job run a PHP script that reads the CSV file contents and inserts them into a database. There are plenty of threads on this too.
To display the data, select from the database and format it appropriately; you can follow any of the basic tutorials on the net for this part.
Post your code if you get stuck :)
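The cron step might look roughly like this with PDO; the DSN, table, and column names below are all placeholders:

```php
<?php
// Sketch of the cron step: bulk-load the exported CSV into a database
// with PDO. The DSN, table, and column names below are all placeholders.

function loadCsvIntoDb(PDO $pdo, string $file, string $insertSql): int {
    $stmt = $pdo->prepare($insertSql);
    $fh = fopen($file, 'r');
    fgetcsv($fh); // skip the header row
    $count = 0;
    $pdo->beginTransaction(); // one transaction keeps the bulk insert fast
    while (($row = fgetcsv($fh)) !== false) {
        $stmt->execute($row);
        $count++;
    }
    $pdo->commit();
    fclose($fh);
    return $count;
}

// Crontab entry (every minute):  * * * * * php /var/scripts/import_quotes.php
// $pdo = new PDO('mysql:host=localhost;dbname=stocks', 'user', 'secret');
// loadCsvIntoDb($pdo, '/var/data/stocks.csv',
//     'INSERT INTO quotes (symbol, price, quoted_at) VALUES (?, ?, ?)');
```

Wrapping the inserts in a single transaction matters here: committing per-row can be an order of magnitude slower when the CSV is large.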
As you've been told, use PHPExcel to read Excel file.
However, refreshing data every second is going to put a very heavy load on your server.
I'd recommend you use server-side push with Comet technologies instead. Take a look at the Meteor server.
You will get a 'persistent' connection, so the server will push data to the client, and the need to refresh the page or make an AJAX request every second disappears.
You've tagged this PHP, so I assume that's your scripting language of choice: use PHPExcel to read the Excel file and write it out as formatted HTML to your web page. PHPExcel's HTML Writer will retain all the style and formatting of the original Excel workbook.
However, an update every second is pretty extreme for a web page, and for a scripting language. Rather than reading the Excel file whenever the page is requested, run this as a background task that converts the Excel to a static HTML file whenever an Excel file is received, and serve the latest static HTML.
If this extreme timing is really needed, you might be better off looking at a compiled language, or even a non-web solution.

PHP fgets "noblock"?

I'm trying to build a little command-line IRC client in PHP because I'm getting sick of all those clients that make you click through twenty GUI popups/windows to connect to a new server.
Everything is working so far, but I'm stuck with the main loop that sends my input commands/messages to the server and receives the new data from it.
As PHP isn't very multitasking-friendly, I have two autonomous PHP scripts running at the same time:
The input reader where I can enter my messages - it stores the current message in a text file.
The server listener/writer which receives new data and reads and clears the text file where the input reader stored my current command in.
However, fgets(), which I use to read new data from the server, blocks the script until something new arrives.
So the input text file can't be read until something new arrives from the server, which is not good.
Is there some special function/option to help me out?
You need to look at streams, and especially stream_set_blocking.
EDIT: in fact, you can get rid of having two processes and do everything in one process. Use non-blocking reads and you should be fine.
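Building on that, a sketch of the single-process version: put both streams into non-blocking mode and wait on them with stream_select(), so neither fgets() call can stall the other. The IRC host and the helper name are just examples:

```php
<?php
// Sketch: multiplex the IRC socket and STDIN in one process using
// non-blocking streams and stream_select().

// Return [stream, line] for the first stream with a complete line ready
// within $sec seconds, or null if nothing arrived in time.
function readReady(array $streams, int $sec): ?array {
    $read = $streams;
    $write = $except = null;
    if (stream_select($read, $write, $except, $sec) > 0) {
        foreach ($read as $s) {
            $line = fgets($s);
            if ($line !== false) return [$s, $line];
        }
    }
    return null;
}

// Main loop of the client (example host/port):
// $irc = stream_socket_client('tcp://irc.example.net:6667');
// stream_set_blocking($irc, false);
// stream_set_blocking(STDIN, false);
// while (true) {
//     $ready = readReady([$irc, STDIN], 1);
//     if ($ready === null) continue;          // nothing happened this second
//     [$s, $line] = $ready;
//     if ($s === $irc) echo $line;            // data from the server (handle PING!)
//     else fwrite($irc, trim($line) . "\r\n"); // command typed by the user
// }
```

With this shape the text-file handoff between two scripts disappears entirely, which is what the EDIT above suggests.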
