I'm trying to get realtime info (speed, downloaded, left) for file downloads made by a script written in PHP and cURL.
curl_getinfo() gives all the required data, but only after the download has finished.
Is there any way to get it in realtime?
AFAIK there is no way to do this with cURL. The curl_multi_* functions will let you make a number of asynchronous requests and check their current state, but only insofar as they will tell you whether they have completed and what their current error level is.
You cannot get information about speed/downloaded/left from cURL. You would have to write your own HTTP request logic using fsockopen() or similar; then you can include logic to update some kind of display as the request progresses. This does have the disadvantage of being much more difficult to make asynchronous: because of PHP's lack of multi-thread support you would have to use exec() or pcntl_fork() and build some kind of horrible IPC architecture.
I have done stuff like this before, and my honest opinion is that it is not worth the effort. If you still want to go ahead and do it, I will dig out some of the stuff I used when I did it.
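For illustration, here is a minimal sketch of the hand-rolled fsockopen() approach described above. The host, path, and chunk size are placeholders, and it assumes a plain HTTP/1.0 download that sends a Content-Length header.

    <?php
    // Rough sketch: send a raw HTTP/1.0 request over fsockopen()
    // and report progress while the body arrives (placeholder host/path).
    $host = 'example.com';
    $path = '/bigfile.zip';

    $fp = fsockopen($host, 80, $errno, $errstr, 30);
    if (!$fp) {
        die("Connection failed: $errstr ($errno)");
    }

    fwrite($fp, "GET $path HTTP/1.0\r\nHost: $host\r\nConnection: close\r\n\r\n");

    // Skip the response headers, but grab Content-Length so we know the total size.
    $total = 0;
    while (($line = fgets($fp)) !== false && trim($line) !== '') {
        if (stripos($line, 'Content-Length:') === 0) {
            $total = (int) trim(substr($line, 15));
        }
    }

    // Read the body in chunks and report downloaded / left / speed as we go.
    $downloaded = 0;
    $start = microtime(true);
    while (!feof($fp)) {
        $chunk = fread($fp, 8192);
        $downloaded += strlen($chunk);
        $elapsed = max(microtime(true) - $start, 0.001);
        printf("downloaded: %d B  left: %d B  speed: %.1f KB/s\n",
            $downloaded, max($total - $downloaded, 0), ($downloaded / $elapsed) / 1024);
    }
    fclose($fp);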
I have created many websites that get all their data from an API
(located on the same server in most cases).
The websites run really slowly because of all the cURL requests.
I first thought it was our MySQL server (a separate server), but now that we have implemented caching it is still slow.
Is there a good way to find out why the cURL requests take so long?
And what would be a good way to go?
You could use a browser REST client to point at your API and a profiling tool (Xdebug/XHProf) to find the source of the bottleneck.
Possibly make sure the API calls resolve locally and don't go all the way out to the internet before coming back in (but that may not shave off much time).
I would recommend starting with the API's code.
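One quick check, separate from profiling the API code itself, is to break a single cURL request down into phases, since curl_getinfo() exposes the timings. The URL below is just a placeholder for one of your API endpoints:

    <?php
    // See whether DNS, connecting, or the API's own processing is eating the time.
    $ch = curl_init('http://localhost/api/endpoint');   // placeholder URL
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);

    $info = curl_getinfo($ch);
    curl_close($ch);

    printf(
        "DNS: %.3fs  connect: %.3fs  first byte: %.3fs  total: %.3fs\n",
        $info['namelookup_time'],
        $info['connect_time'],
        $info['starttransfer_time'],
        $info['total_time']
    );
    // Large namelookup/connect times point at DNS or networking;
    // a large starttransfer_time points at slow code behind the API itself.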
The problem is not the cURL request.
I've been working on sockets, generally in PHP, for a while. Currently I have a PHP client that connects to a chat server and outputs every piece of data sent by the server it's connected to.
To explain in a bit more detail, I accomplished this using PHP's flush() function to write out each buffer waiting in the loop. The buffer reader sits inside a while loop whose condition is the status of the connection socket. But this matters less.
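Roughly, the reader loop looks like this (the host and port are made up for the example):

    <?php
    // Rough shape of the current client: connect to the chat server,
    // then echo every chunk it sends as soon as it arrives (made-up host/port).
    $socket = fsockopen('chat.example.com', 5222, $errno, $errstr, 30);
    if (!$socket) {
        die("Could not connect: $errstr ($errno)");
    }

    while (!feof($socket)) {               // keep going while the connection is open
        $chunk = fread($socket, 4096);     // read whatever is waiting in the buffer
        if ($chunk !== false && $chunk !== '') {
            echo $chunk;
            flush();                       // push the output to the browser right away
        }
    }
    fclose($socket);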
Now, to what I want to accomplish: I want to keep the socket handling on the server side and have the data from the server output to the client via AJAX/jQuery. So far my research has always pointed me to HTML5 WebSockets and node.js; however, I "have to" be really picky about this, because the users I need to support may include:
WinXP IE6 users (which already rules out even jQuery)
Users without Java/Flash installed
So I have to think of all the possibilities here, which is why I can't use a Flash/Java backend or a new technology like WebSockets, and nor do I want to handle server stuff in the client. I really hate being stuck with old technology, but for this it's a must.
As I was searching around, I found this question, which is similar to my needs:
Is PHP socket a viable option for making PHP jQuery based chat?
And to quickly review the answers: they all point in one direction, PHP multi-processing and memory consumption. I know this is a minus, but it's the best I can accept for now. Still, there will be timeout disconnects for inactive connections after a certain delay, with an extension of the delay if wanted. So I'm not very keen on this one.
Secondly, regarding the last answer pointing to the "Ajax Chat Application Tutorial": I gave it an overall review, but whoa, writing each line into an HTML file and re-including it each time? That is something I could do without using an extra file, but is it really necessary? Plus re-reading the file on the server side and re-importing the whole file into the document every single time, isn't that just worse for both sides?
Either way, that's about it. I wasn't able to come to a conclusion for a while, and so here I am again. (:P) Waiting for your answers/suggestions/ideas, thanks in advance.
Regards.
There is server software available that specializes in such matters; it's called a push server/service. One example is APE (http://www.ape-project.org/); according to their website it's compatible with all web browsers, and they even have a demo chat there. I'd suggest you go for that solution.
A page sends an AJAX call to the server and should get item info in response. The array to look up and return is rather big, and I can't hold it in the PHP file that accepts the request. So, as far as my knowledge and experience tell me, there are two methods:
Access database for each request.
Store items in files (e.g. “item12.txt”) and send contents to the user.
My C experience says that opening and closing a file takes much more system time than the rest of the program. How is it in PHP? What is the preferred method (most importantly, resource-wise): file system or database? Is there any other way you would recommend (e.g. JavaScript loading the file with the variable array directly from the server for each request)? Maybe there's some innovative method lying around that you're aware of?
P.S. On the server side only a number will be accepted, so no worries about someone trying to access files on the server or do some fancy stuff with the database.
Sockets
Depending on how many requests you will be handling, you could look into socket connections.
Sockets give you two-way communication between the client and the server, which would allow you to do interactive things as needed (see the sketch after the tutorial links below).
Socket tutorial 1
Socket tutorial 2
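As a rough illustration of that two-way flow, here is a minimal PHP socket server that answers item lookups over a persistent connection. The port and the item data are made up; a real version would pull the item info from wherever you store it.

    <?php
    // Tiny sketch of a PHP socket server (made-up port and stand-in data):
    // the client keeps one connection open, sends an item id per line,
    // and gets the item info pushed straight back on the same connection.
    $server = stream_socket_server('tcp://127.0.0.1:9000', $errno, $errstr);
    if (!$server) {
        die("Could not start server: $errstr ($errno)");
    }

    $items = array(12 => 'Sword of Testing', 13 => 'Shield of Examples');

    while ($conn = stream_socket_accept($server, -1)) {
        while (($line = fgets($conn)) !== false) {
            $id    = (int) trim($line);
            $reply = isset($items[$id]) ? $items[$id] : 'not found';
            fwrite($conn, $reply . "\n");
        }
        fclose($conn);
    }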
Node.js
node.js is the new kid on the block. You write your own socket webserver and use JavaScript to communicate with it. This is a great alternative to Ajax, as it's much more efficient and reliable.
node.js can be run alongside PHP, and only be used for ajax-like calls.
node.js
node.js socket tutorial
There is nothing innovative here. If you have low-frequency calls to the data and you want super simple access, then use files. But these days it is much better to use a database; SQLite is fine, I think. If you need more performance, use MySQL or a NoSQL solution. Tools are made to solve problems: use the right tool for your purpose.
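For example, a minimal sketch of the SQLite route, using a table and column I made up for illustration (items with id and info), returning the result as JSON for the AJAX call:

    <?php
    // Look up one item by numeric id and return it as JSON (made-up table/column names).
    $id = isset($_GET['id']) ? (int) $_GET['id'] : 0;   // only a number is accepted

    $pdo = new PDO('sqlite:' . __DIR__ . '/items.db');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $stmt = $pdo->prepare('SELECT info FROM items WHERE id = :id');
    $stmt->execute(array(':id' => $id));
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    header('Content-Type: application/json');
    echo json_encode($row ? $row : array('error' => 'not found'));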
I am developing a non-real-time browser RPG game (think Kingdom of Loathing) which would be played from within a Flash app. At first I just wanted to handle the communication with the server by simply using URLLoader to tell PHP what I am doing, and $_SESSION to store the data needed between requests.
I wonder if it wouldn't be better to base it on a socket connection, with an app residing on the server written in Java or Python. The problem is I have never written such an app, so I have no idea how much I'd have to "shift" my thinking from simply responding to requests (like PHP) to a continuously running application. I won't hide that I am also concerned about the memory and CPU usage of such a server app when, for example, hundreds of users are connected.
I have tried to do some research, but thanks to my nil knowledge of the sockets subject I haven't found anything helpful. So, considering the fact that I don't need real-time data exchange, would it be wise to develop the server-side part as a socket server rather than in plain ol' PHP?
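For clarity, the plain PHP approach I have in mind is roughly this (the actions and stats are made up for the example): Flash posts an action via URLLoader, and PHP keeps the player state in $_SESSION between requests.

    <?php
    // Rough sketch of the request/response approach (made-up actions and stats).
    session_start();

    if (!isset($_SESSION['player'])) {
        $_SESSION['player'] = array('hp' => 100, 'gold' => 0);
    }

    $action = isset($_POST['action']) ? $_POST['action'] : '';

    switch ($action) {
        case 'fight':
            $_SESSION['player']['hp']   -= rand(1, 10);
            $_SESSION['player']['gold'] += rand(1, 5);
            break;
        case 'rest':
            $_SESSION['player']['hp'] = 100;
            break;
    }

    // The Flash app reads this response with URLLoader.
    header('Content-Type: application/json');
    echo json_encode($_SESSION['player']);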
Since your game isn't something that works in realtime, you probably don't need to go down the socket route, though it's certainly a viable option. The nice thing about sockets is that updates would be instant without requiring a page refresh (or server poll), so you're right to at least consider it.
If you do want to do a more real-time server setup, you might consider using something like Electroserver - this abstracts out much of the setup for you so you don't have to write your own server from scratch, plus it's free up to a certain number of concurrent users if I recall correctly.
Finally, a third option is a modified POST approach using AMF. Look into AMFPHP: it lets you call methods on a PHP back end directly from your Flash application. A little faster and easier than plain POST requests, but not quite as seamless as a socket connection or a purpose-built gaming server.
Lots of options out there, it sounds like you are aware of this and kudos for trying to come up with the best approach rather than just rolling with what you know! I hope this helps, let me know if you have any questions.
Here's a link to Electroserver - http://www.electro-server.com/
I'm trying to find an efficient way to watch the server log on a webpage. I don't mind building an app; I just can't work out the best way to do it.
Is there a way to keep a stream open to a file with PHP and to the browser, or will it have to be done by polling the file every x seconds?
Thanks in advance,
Shadi
The best solution is definitely AJAX in some capacity. The only way to have the server "push" to you the way you describe (maintaining an open stream) would require the HTTP connection to remain open, which would ultimately trigger timeouts and consume a lot of resources. I would look into the CometD library. The downside is that I believe it depends on Java, although the site does mention Perl, Python and "other languages." In the worst case, you could use a specific Jetty implementation just for log monitoring on a specific port. Regardless, that framework would most likely be your best bet.
Any web-based chat mechanism essentially uses a push architecture and would be good to look at for some inspiration. In this case, instead of users creating messages that are fired to other users, the server creates the events (when a log message is generated). Check out this article on Facebook chat for some insight into how they do it. Google chat might be worth looking into if you can find some stuff on the architecture.
For the actual logging, I'm not sure if you are in need of help for that, but log4php which is currently under incubation might be a good place to start as it provides you with a configuration that can simultaneously log to an arbitrary number of "loggers" like database, file, socket, etc. You could likely find one that would allow you to tie it into whatever push framework you elect to use.
Good luck!
Remember that the web model is essentially stateless (disconnected). With that in mind, when a client submits a request, the server processes the request and then sends a response accordingly. You can keep track of the client's actions using cookies and/or sessions, but the resources reserved for a request are released after the response is sent back.
I think the best way to meet your goal is to develop a web service that checks the status of the log and fetches the diff (if any). Your app may consist of a web page with a div that displays the diff from the web service.
A script with a timer will trigger the calls to the web service.
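As a rough sketch of such a web service (the log path is an assumption): the page remembers the byte offset it has already seen and sends it back with each call, and the service returns only what was appended since then.

    <?php
    // Return the part of the log appended since the given offset (assumed log path).
    $logFile = '/var/log/myapp.log';
    $offset  = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

    $size = filesize($logFile);
    $diff = '';

    if ($size > $offset) {
        $fh = fopen($logFile, 'r');
        fseek($fh, $offset);
        $diff = fread($fh, $size - $offset);   // only the newly appended part
        fclose($fh);
    }

    header('Content-Type: application/json');
    echo json_encode(array('offset' => $size, 'data' => $diff));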
I will try to do something like this in a few weeks, and I will post the entire solution on the moropo blog (in Spanish). You can ask for a translation of the post in the comments.
The best way to do it is to use AJAX to pull the file contents every x seconds, giving the illusion of real time.
If you do want real time, you can use an XMPP server, but from what I can see the first solution is more than sufficient and doesn't require a lot of work.
Try wonlog.
https://www.npmjs.com/package/wonlog
You can stream multiple log files to a web browser.