Keep PHP file always running on EC2

I have a server.php file in my Elastic Beanstalk website running on an EC2 instance. It creates a WebSocket server and keeps it alive with an infinite loop (it takes messages and sends them to the correct client).
But after deployment, server.php never starts running until I open it in my browser, and I'm not sure whether it keeps running after that.
I don't know the right way to do this. If this is the correct approach, how can I get server.php to start after deployment and keep running permanently?

Use supervisord. It's commonly used by Laravel (PHP) to keep queue workers running. It's quite convenient and has nice features that can be enabled, such as detecting whether a script failed to start, retries, automatic restarts, delayed start, and some other quality-of-life options.
There are some tutorials: link and link.
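As a minimal sketch (the file location, program name and script path below are assumptions, not taken from the question), a supervisord program entry could look roughly like this:

; hypothetical /etc/supervisor/conf.d/websocket.conf
[program:websocket]
; path assumed for an Elastic Beanstalk deploy; point it at wherever server.php actually lives
command=php /var/app/current/server.php
; start the script when supervisord starts and restart it if it ever exits
autostart=true
autorestart=true
startretries=3
stdout_logfile=/var/log/websocket.out.log
stderr_logfile=/var/log/websocket.err.log

After dropping in a file like that, something like supervisorctl reread followed by supervisorctl update makes supervisord pick it up.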

Related

PHP WebSocket Servers

I want to set up a WebSocket server using PHP. I have many alternatives for doing this, yes, but I wanted to ask people with experience which one is more reliable (robust, lightweight, and fault-free). I also wrote some code that creates a socket server, but I'm not sure how to start it. Do I need to open the page from the browser?
Usually a WebSocket server, like many servers, is started as a daemon in the background, where it listens for incoming events. On Linux (Ubuntu) you might create a unit file to be consumed by the system command systemctl; that way the daemon is started with every boot of the system, and you can start and stop it as you need to.
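For illustration only (the unit name, script path and user are assumptions), such a unit file might look like:

# hypothetical /etc/systemd/system/websocketserver.service
[Unit]
Description=PHP WebSocket server
After=network.target

[Service]
# adjust the interpreter and script path to your installation
ExecStart=/usr/bin/php /opt/app/websocketserver.php
Restart=on-failure
User=www-data

[Install]
WantedBy=multi-user.target

It can then be enabled for every boot with something like systemctl enable --now websocketserver.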
Alternatively, you can start it from a command shell like
nohup php websocketserver &
This will send your server into the background, where it stays until the system is rebooted. Any output is logged to a file named nohup.out.
On Windows you are better off creating a service, via a scheduled task, to have your server start at system boot.

Having the process run until completion after the browser is closed

I have a website, created using PHP and running on Apache. I want a subscriber to be able to log in and start a process on the server. They can then log out or close the browser without interrupting the process. Later they can log in and see the progress or see the results of the original process. What is the best way to accomplish this (having the process run until completion, after the browser is closed)?
Just looking for someone to point me in the right direction. A few people mentioned Gearman.
Gearman would be an ideal candidate, and I would use it for exactly the purpose you describe. It has everything you need out of the box to meet your requirements ("backgrounding" a long-running, CPU-bound process onto another machine, e.g. video encoding).
There is a Gearman PHP library, but you can write your worker code in a different language if it's better suited to doing the work.
For reporting progress information, I recommend having the worker write to Redis or Memcached - some kind of temporary storage that your web server can also access.
Check out the simple PHP example on the Gearman site. For learning, I recommend setting up a lab environment that contains 3 separate VM's, one for your web server (the client), one for the Gearman job queue (the server) and another for processing jobs (the workers).
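A minimal sketch of what that looks like with the PECL Gearman extension (the server address, job name and payload below are made-up placeholders):

<?php // client.php - runs on the web server, submits a job and returns immediately
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);                 // Gearman job server (gearmand)
$client->doBackground('encode_video', json_encode(['id' => 42]));

<?php // worker.php - long-running process on the worker machine
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('encode_video', function (GearmanJob $job) {
    $data = json_decode($job->workload(), true);
    // ... do the CPU-bound work here, writing progress to Redis/Memcached as you go ...
});
while ($worker->work());                               // block and process jobs forever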

Reloading multiple application servers on deploy

I have a setup where several application servers run the php-fpm service and all share a GlusterFS mount for the application code and other assets. In the current deploy process, the files get updated directly on the file server, and to reflect the changes the application service often has to be reloaded. To achieve that, the deployment script needs to get into every server and issue a reload command, but with autoscaling the number of servers is not the same at every moment.
Overall, I am sketching a couple of alternatives to solve this problem:
The first, more artisanal and not perfect, as a proof of concept, would be a cron job that runs every X minutes on the application machines and looks for a file that should contain a unique piece of information such as the machine's hostname or IP address. If it matches, no action is taken; if not, the server reloads and writes itself into the file. In the deployment procedure, the script would clear the file, and all servers should get reloaded on the next cron run (a rough sketch is below).
The second, a more sophisticated approach, would use a message queue or notification service that the running application machines subscribe to at boot time, waiting for an order to reload. The deploy script would then publish a notification to let all servers know it is time. A cron job similar to the one in the previous method would then notice that and reload the app server.
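Roughly, the first idea would be something like this (the marker-file path and the reload command are just placeholders):

#!/bin/bash
# hypothetical /usr/local/bin/reload-if-needed.sh, run from cron every few minutes
MARKER=/shared/deploy/reloaded-hosts    # marker file on the GlusterFS mount, cleared by the deploy script
HOST=$(hostname)
if ! grep -qx "$HOST" "$MARKER" 2>/dev/null; then
    systemctl reload php-fpm            # or "service php-fpm reload", depending on the distro
    echo "$HOST" >> "$MARKER"           # record that this server has already reloaded
fi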
Would any of that make sense? Is there any simpler or more standard way to trigger a broadcast to the application servers running at a given moment during the deploy procedure, without having to SSH into each one and issue the reload command? Any other advice or suggestions?
Thanks!

What exactly entails setting up a PHP Websocket Server?

I'm getting into WebSockets now and have been successfully using the hosted WebSocket services Pusher (didn't like it) and Scribble (amazing, but downtime is too frequent since it's just one person running it).
I've followed this tutorial http://www.flynsarmy.com/2012/02/php-websocket-chat-application-2-0/ on my localhost and it works great!
What I wanted to ask is: how do I set up the server.php from the above tutorial to run as a WebSocket server on an online web host/shared server?
Or do I need to get a VPS (and if so, which one do you recommend, and how can I set up the WebSocket server there, as I've never really used a VPS before)?
Thank you very much for reading my question and answering. I've read all other question/answers here regarding sockets but haven't been able to find the answer to my above questions yet. Hopefully I find it here!
This is tricky.
You need to execute the server.php script, and it needs to never exit. If you have SSH access to your shared server, you could execute it just like they do in the screenshot and make it run as a background task using something like nohup:
$ nohup php server.php
nohup: ignoring input and appending output to `nohup.out'
After invoking this (using the SSH connection), you may exit and the process will continue running. Everything the script prints will be stored into nohup.out, which you can read at any time.
If you don't have SSH access, and the only way to actually execute a PHP script is through Apache as the result of a page request, then you could simply go to that page in a browser and never close it. But sooner or later there will be a timeout, the connection between you and Apache will close, and that effectively stops the server.php script.
On top of that, a lot of shared hosts will not permit a script to run indefinitely. You will notice this line in server.php:
set_time_limit(0);
This tells PHP that there is no time limit. If the host runs PHP in safe mode (which a lot of them do), then you cannot use set_time_limit, and the time limit is probably 30 seconds or even less.
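Where the host does allow it, the usual pattern (a sketch; the ignore_user_abort call is an addition beyond what the tutorial shows) is to also tell PHP not to stop when the visitor disconnects:

// at the top of server.php
ignore_user_abort(true);  // keep running even if the browser or connection goes away
set_time_limit(0);        // remove the execution time limit (not available in safe mode)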
So yes, a VPS is probably your best bet. Now, I don't own one myself, and I don't know what's a good/bad price, but I'd say HostGator seems fine.

apns running php script in distribution mode on live server

Everyone, I have studied and implemented these tutorials by Ray:
http://www.raywenderlich.com/3443/apple-push-notification-services-tutorial-part-12
http://www.raywenderlich.com/3525/apple-push-notification-services-tutorial-part-2
I have implemented APNs on a local server, and now I want to do it on a live server. My question is about the script "push.php", which we run on the local server using:
/Applications/MAMP/bin/php5.2/bin/php push.php development
How can we run it on the live server in production mode? Do we have to ask the hosting provider (we are using HostGator services) to run this script for us, or, as Ray says:
"However, on your production server you should start the script as follows:
$ /Applications/MAMP/bin/php5.2/bin/php push.php production &
The “&” will detach the script from the shell and put it in the background."
Does this mean we use the command-line interface to run that script on the live server? I am a little confused because on the server side we normally use cron jobs to execute scripts, but this "push.php" should never exit, so I am not sure what to do. Please guide me on this, thanks in advance. Regards, Saad
Yes, the command line interface should be used to run the PHP script and keep it running in the background.
However, as you are on a shared hosting service, I doubt they will let you run PHP continuously.
You may want to try asking them whether it is possible; if it is not, just edit the PHP script you cited so that, instead of opening the connection at the beginning and keeping it running, every time it is invoked it opens a connection to the Apple server, sends the message, closes the connection, and exits. Although this is not encouraged by Apple, it would allow your script to be invoked by the web server only when necessary (so no continuously running process is required).
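A rough sketch of that "connect, send, disconnect" variant, assuming the legacy binary APNs interface the tutorial is built on (the certificate file, passphrase and device token below are placeholders):

<?php // send_one.php - hypothetical one-shot version of push.php
$ctx = stream_context_create();
stream_context_set_option($ctx, 'ssl', 'local_cert', 'ck.pem');       // your push certificate
stream_context_set_option($ctx, 'ssl', 'passphrase', 'your-passphrase');

$fp = stream_socket_client('ssl://gateway.push.apple.com:2195', $errno, $errstr,
                           60, STREAM_CLIENT_CONNECT, $ctx);
if (!$fp) {
    exit("Failed to connect: $errno $errstr" . PHP_EOL);
}

$deviceToken = 'REPLACE_WITH_64_CHAR_HEX_TOKEN';
$payload = json_encode(array('aps' => array('alert' => 'Hello', 'sound' => 'default')));

// legacy "simple" notification format: command byte, token length, token, payload length, payload
$msg = chr(0) . pack('n', 32) . pack('H*', $deviceToken)
     . pack('n', strlen($payload)) . $payload;

fwrite($fp, $msg, strlen($msg));   // send one notification
fclose($fp);                       // close the connection and let the script exit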
