First and foremost: please note that this post isn't about Ajax/PHP status updating. It's pure PHP; my script needs to be able to run in the console. So please be gentle with the duplicate flags. Thank you.
I have a web application that starts multiple long-running operations on multiple remote servers. The output is in the same format for all of those scripts. My master script needs to retrieve the status of all those scripts, process them, and output the result.
So far, I've tried:
Start the remote process, then poll results from a check_status.php, which shares the status array via transparent session variables. I found it too error/failure-prone, and couldn't get a really solid implementation of it.
Stream the output of the server, and each time pick out the last status JSON string and use it. But this also felt, well, clumsy.
I'm sure I'm missing something. So I'm asking for your help.
My solution requires:
No additional packages to be installed (e.g. PEAR, PECL, ...).
Works on PHP 5.3.
Preferably (not necessary to post a solution, I'd just be grateful) using rmccue/Requests; see the sketch after this list.
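To illustrate what I mean, here is a minimal polling sketch using rmccue/Requests. The status URLs and the JSON fields (job, progress) are made up for the example; Requests::request_multiple() fires the whole batch in one go:

require_once 'Requests/library/Requests.php'; // path depends on where the library lives
Requests::register_autoloader();

$statusUrls = array(
    'http://server1.example.com/check_status.php', // illustrative URLs
    'http://server2.example.com/check_status.php',
);

do {
    // Build one GET request per remote script and fire them as a batch.
    $requests = array();
    foreach ($statusUrls as $url) {
        $requests[] = array('url' => $url, 'type' => Requests::GET);
    }
    // Failed requests come back as exception objects; error handling omitted here.
    $responses = Requests::request_multiple($requests);

    $allDone = true;
    foreach ($responses as $response) {
        $status = json_decode($response->body, true);
        echo $status['job'] . ': ' . $status['progress'] . "%\n";
        if ($status['progress'] < 100) {
            $allDone = false;
        }
    }
    sleep(2); // poll interval
} while (!$allDone);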
I'm building a small application to collect data. PHP is used for the collection and PostgreSQL for the storage. PostgreSQL is bundled with the application, so I have full control over it. The collection PHP is triggered by an external entity, and I have no control over the PHP interpreter that will run the code.
Is there a way to load php_pgsql.dll at run time?
I know this has been asked already, for example here and here, and my best source of information was here. If I understand correctly, there is no way unless I'm root on the system (because dl() was removed).
I can bundle PHP with my application the same way I bundled PostgreSQL (to have control over it and not need to ask someone to install, configure, and maintain it), BUT my PHP files are triggered by an external application, so I have no control over the PHP interpreter/environment that is used.
Is there a way to start, from PHP code (let's call it systemPHP), the same PHP code but in a different PHP environment (a myPHP environment that I control and where the DLL will be included)?
For example, if systemPHP starts collect.php, the pseudocode of collect.php would be:
if ($thisIsMyPHP) {   // How to detect it?
    // <execute the data collection code>
} else {
    // <Start collect.php in myPHP, transferring all the data to it>
    // (For example, if started by Apache: also headers, session information, etc.)
    // <Send back the result from myPHP via systemPHP>
}
How to achieve this PHP 'tunnel'?
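For illustration, one possible shape of the tunnel I'm imagining, assuming an environment-variable marker for detection and a known path to my own PHP binary (both are assumptions, not something I have working):

define('MYPHP_BIN', 'C:\\myapp\\php\\php.exe'); // assumed path to the bundled interpreter

if (getenv('MYPHP') === '1') {
    // Running under myPHP: the DLL is available, do the real work.
    collect_data(); // hypothetical function holding the collection code
} else {
    // Running under systemPHP: relay to myPHP, passing the request data along.
    putenv('MYPHP=1'); // the child process spawned below inherits this marker
    $payload = base64_encode(serialize($_REQUEST)); // forward request data as an argument
    $cmd = escapeshellarg(MYPHP_BIN) . ' ' . escapeshellarg(__FILE__) . ' ' . escapeshellarg($payload);
    passthru($cmd, $exitCode); // myPHP's output goes straight back through systemPHP
    exit($exitCode);
}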
Thanks for any help or hint. I know the best option would be to be root, or at least to have a cooperative admin, but that is not the case :-(
Currently I'm trying a workaround: executing the database tasks via the shell and then getting the response back in PHP. Sometimes it works and sometimes it doesn't, and I believe there is a better way of doing this (not to mention the speed and resource usage).
Have you looked into using a message queue system? Write to the queue, then have a PHP script running, with php_pgsql.dll already loaded, that checks for new messages in the queue and processes them.
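A minimal sketch of that idea, assuming a plain directory as the queue (a database table or any other shared medium would work the same way). The worker runs under your own PHP build, the one with php_pgsql.dll loaded:

$queueDir = '/var/myapp/queue'; // assumed spool directory written to by systemPHP

while (true) {
    foreach (glob($queueDir . '/*.job') as $jobFile) {
        $job = json_decode(file_get_contents($jobFile), true);
        // ... process $job here with pg_connect()/pg_query() ...
        unlink($jobFile); // remove the message once handled
    }
    sleep(1); // avoid busy-waiting
}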
OK, so I'm at the starting stage of a new project where I have an Apache web server with PHP and a MySQL database.
The main aim of this project is to show data from this MySQL database in real time on the web page. The problem is that I am not allowed to install any new software on the server, so I cannot use Node.js or Socket.IO.
I've been looking at the PHP long-polling possibility, but I'm curious whether anyone out there has managed to pull off something similar without grinding their server to a halt because too many threads are in use.
I've heard about Comet, but I'm not sure how that would work; from what I've read, it seems to just watch flat files, not databases.
Thanks for any help.
This is easily achievable with jQuery and PHP: create a PHP file and echo JSON-encoded data in return. Usage can be found here: jQuery post.
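A minimal long-polling sketch of that PHP file, assuming a messages table with an auto-incrementing id column (table, column, and credentials are illustrative). The client calls it with the last id it has seen, and the script holds the request open until something newer appears:

// poll.php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // assumed credentials
$lastId = isset($_GET['last_id']) ? (int)$_GET['last_id'] : 0;

$rows = array();
$deadline = time() + 25; // hold the request open for up to ~25 seconds
do {
    $stmt = $pdo->prepare('SELECT id, body FROM messages WHERE id > ? ORDER BY id');
    $stmt->execute(array($lastId));
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if ($rows) {
        break; // new data: answer immediately
    }
    usleep(500000); // otherwise wait half a second and check again
} while (time() < $deadline);

header('Content-Type: application/json');
echo json_encode($rows); // [] on timeout; the client simply polls again

Bear in mind that each open request ties up an Apache/PHP worker for its duration, which is exactly the thread-count concern raised in the question.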
I am working on a project where I take the URL of a page and find an image to represent that page. After looking at the metadata, I look at the page content and start scraping it. This is a PHP project.
Using PHP, the getimagesize() function has a long timeout and if there are lots of images on the page, it becomes a very slow process.
I went with a solution that is a little odd, but very fast. I am looking to see if there is a better solution or if there are any glaring problems with my solution.
--- Start Solution ---
In PHP, build an array of image URLs. Connect to a socket handled by Node.js and pass the array of URLs to the Node.js handler. Meanwhile, do a blocking read on the PHP side, waiting for a response.
In Node.js, I get the image size/type for each URL using the 'imagesize' module. When all the jobs are finished, the URL of the largest image is written to the socket. Any error is also written to the socket before it is closed. This runs as a daemon with the 'forever' module.
The PHP side unblocks after it reads from the socket.
--- End Solution ---
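For reference, the PHP half of this, as described, looks roughly like the following sketch; it assumes the Node.js daemon listens on localhost port 9000 and speaks newline-delimited JSON (both details are placeholders):

$urls = array('http://example.com/a.jpg', 'http://example.com/b.png');

$sock = stream_socket_client('tcp://127.0.0.1:9000', $errno, $errstr, 5);
if (!$sock) {
    die("connect failed: $errstr ($errno)");
}
fwrite($sock, json_encode($urls) . "\n"); // hand the URL list to the daemon

// Blocking read: this is where PHP waits for Node.js to finish.
$largest = trim(fgets($sock));
fclose($sock);

echo 'Largest image: ' . $largest . "\n";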
Is this an acceptable way to solve this problem? It works, and on some poorly formed Blogspot pages I saw 10x performance improvements. I am aware there are some threading solutions in PHP, but I am worried about the sheer number of threads some pages would need.
Update:
The requirement is that this gives immediate feedback, so cron jobs or queues don't fit.
If you see performance improvements, then it's an acceptable way to solve your problem. I would suggest trying the shell_exec() function to run your Node.js process. You can pass the URLs as command-line arguments, which you can access via process.argv; that way you won't need to keep your Node.js process running all the time. Run it only when needed, from PHP.
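A sketch of that variant, where imagesize.js stands for a hypothetical script that reads the URLs from process.argv and prints the largest image's URL to stdout:

$urls = array('http://example.com/a.jpg', 'http://example.com/b.png');

$cmd = 'node ' . escapeshellarg('/path/to/imagesize.js'); // assumed script path
foreach ($urls as $url) {
    $cmd .= ' ' . escapeshellarg($url); // each URL becomes one process.argv entry
}
$largest = trim(shell_exec($cmd));
echo 'Largest image: ' . $largest . "\n";

The trade-off is paying the Node.js startup cost on every call instead of keeping a resident daemon, so the socket approach may still win if lookups are frequent.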
I've been working with sockets, generally in PHP, for a while. Currently I have a PHP client that connects to a chat server and outputs every piece of data sent by the server it's connected to.
To explain that in a bit more detail: I accomplished this using PHP's flush() function to write out each buffer waiting in the loop. The buffer reader sits inside a while loop whose condition is the status of the connection socket. But that matters less here.
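In rough outline, that loop looks something like this (host and port are placeholders):

$sock = fsockopen('chat.example.com', 6667, $errno, $errstr, 10);

while (!feof($sock)) { // loop while the connection socket is alive
    $line = fgets($sock);
    echo $line;
    flush(); // push each waiting buffer straight out to the browser
}
fclose($sock);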
Now to what I want to accomplish: I want to keep the socket handling on the server side and deliver the data from the server to the client via AJAX/jQuery. So far my research has always pointed me to HTML5 WebSockets and node.js; however, I "have to" be really picky about this, as the minimum clients I must support are:
Windows XP IE6 users (which already rules out even jQuery)
Users without Java/Flash installed
So I have to think about the possibilities here, which is why I can't use a Flash/Java backend or a new technology like WebSockets, and I don't want to handle server stuff in the client either. I really hate being stuck with old technology, but for this it's a must.
As I was searching around, I found this one, which is quite similar to my needs.
Is PHP socket a viable option for making PHP jQuery based chat?
And to quickly review the answers: they all point in one direction, PHP multi-processing and memory consumption. I know this is a minus, but it's the best I can accept for now. Still, there will be timeout disconnects for inactive connections after a certain delay, with extension of the delay if wanted. So I'm not too keen on this one.
Secondly, the last answer points to an "Ajax Chat Application Tutorial". I gave it an overall review, but whoa: writing each line into an HTML file and re-including it every time? That's something I could do without using an extra file, but is it really necessary? Plus re-reading the file on the server side and re-importing the whole file into the document every single time; isn't that just worse for both sides?
Either way, that's about it. I wasn't able to come to a conclusion for a while, and so here I am again (:P). Waiting for your answers/suggestions/ideas; thanks in advance.
Regards.
There is server software available that specializes in such matters; it's called a push server/service. There's, for example, APE (http://www.ape-project.org/); according to their website, it's compatible with all web browsers, and they even have a demo chat there. I'd suggest you go with that solution.
I've got a website that receives posted XML data from a third party.
I'm looking for a method to batch-post a number of XML files to this script for development/debugging purposes.
I have built a PHP script that loops through an array of files and uses cURL to post each file separately. However, due to the number of files I wish to post, I feel PHP isn't the best method, as the script times out.
Ideally, I'm looking for a terminal process / OS X application that will pick up all the files in a given directory and post the contents of each one to a defined URL, one by one.
Any suggestions/ideas greatly received.
Jim.
I'm a little confused. Why not just set the CURLOPT_TIMEOUT option appropriately in your posting application? Otherwise your PHP solution seems to be functioning OK. (If it's the script itself that times out rather than the individual requests, note that PHP run from the command line has no execution time limit by default.)
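For what it's worth, a minimal sketch of that as a command-line PHP script (run from the terminal, so no web-server timeout applies; the endpoint URL and directory are placeholders):

$url = 'http://example.com/receive.php'; // target script
foreach (glob('/path/to/xml/*.xml') as $file) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, file_get_contents($file)); // raw XML body
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: text/xml'));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 60); // per-request cap, not a script limit
    $response = curl_exec($ch);
    echo basename($file) . ' => ' . ($response !== false ? 'ok' : curl_error($ch)) . "\n";
    curl_close($ch);
}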
If you don't want a PHP solution, have you looked at posting via HTTP in parallel, using a scripting language with threading support (e.g. Ruby or similar)? The only headache is that you're now going to load your server more for the benefit of your script running faster, and you need to determine what sort of load your server-side process can handle.