I'm getting into WebSockets now and have successfully used the hosted WebSocket services Pusher (didn't like it) and Scribble (amazing, but downtime is too frequent since it's just one person running it).
I've followed this tutorial http://www.flynsarmy.com/2012/02/php-websocket-chat-application-2-0/ on my localhost and it works great!
What I wanted to ask is: how do I set up the server.php from the tutorial above to run as a WebSocket server on an online web host / shared server?
Or do I need to get a VPS? (And if so, which one do you recommend, and how can I set up the WebSocket server there, as I've never really used a VPS before?)
Thank you very much for reading my question and answering. I've read all the other questions/answers here regarding sockets but haven't been able to find the answer to the questions above yet. Hopefully I'll find it here!
This is tricky.
You need to execute the server.php script, and it needs to never exit. If you have SSH access to your shared server, you could execute it just like they do in the screenshot and make it run as a background task using something like nohup:
$ nohup php server.php &
nohup: ignoring input and appending output to `nohup.out'
After invoking this (over the SSH connection), you may exit and the process will continue running. Everything the script prints will be stored in nohup.out, which you can read at any time.
If you don't have SSH access, and the only way to actually execute a PHP script is through Apache as the result of a page request, then you could simply go to that page in a browser and never close it. But sooner or later the connection between you and Apache will time out, effectively stopping the server.php script.
In either of those cases, a lot of shared hosts will not permit a script to run indefinitely. You will notice this line in server.php:
set_time_limit(0);
This tells PHP that there is no time limit. If the host runs PHP in safe mode (which a lot of them do), then set_time_limit has no effect and the time limit is probably 30 seconds or even less.
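If you're not sure what limits your host enforces, a quick check like this will tell you (a minimal sketch; on modern PHP versions safe mode no longer exists, so the second value may simply be false):
<?php
// Show the current execution time limit and whether safe mode is enabled.
// When safe mode is on, set_time_limit() is silently ignored.
var_dump(ini_get('max_execution_time')); // "0" means no limit
var_dump(ini_get('safe_mode'));          // a truthy value means safe mode is on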
So yes, a VPS is probably your best bet. I don't own one myself and I don't know what counts as a good or bad price, but HostGator seems fine.
I have a server.php file in my Elastic Beanstalk website, running on an EC2 instance. It creates a WebSocket and keeps it alive with an infinite loop (it takes messages and sends them to the correct client).
But after deployment the server.php file never starts running until I open it in my browser, and I am not sure whether it keeps running after that.
I don't know the right way to do this. If this is the correct approach, how can I get server.php to start after deployment and keep running permanently?
Use supervisord. It's commonly used by Laravel (PHP) to keep queue workers running. It's quite convenient and has nice features that can be enabled, such as detecting whether a script failed to start, retries, automatic restarts, delayed starts, and some more quality-of-life options.
There are some tutorials available: link and link.
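For reference, a minimal supervisord program entry might look something like this (just a sketch; the program name, command path and log paths are placeholders you'd adapt to your deployment):
[program:websocket-server]
command=php /var/app/current/server.php
autostart=true
autorestart=true
startretries=3
stdout_logfile=/var/log/websocket-server.out.log
stderr_logfile=/var/log/websocket-server.err.log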
I am not sure of all the technical terms, but I think I am describing this in an understandable way.
I am going to write a PHP script that will listen on a port and send out data as necessary, i.e. a basic game listener process. There are many places on the web with hints about this, so I think I will be all right there, but any further suggestions on tutorials are welcome.
That is not my question. This is:
What I want to do is have a server where this script is NOT automatically running. Instead, when someone fires up the game on their client, it will check whether a listening script is already running on my server and, if not, launch it. (I am not sure how to do that yet, but I don't expect it to be too hard: just send some data to that port and see if there is a response.)
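A rough sketch of that check in PHP (assuming a Linux host; the port number and the listener.php file name are placeholders):
<?php
// Try to connect to the game port; if nothing answers, assume the
// listener isn't running yet and launch it in the background.
$conn = @fsockopen('127.0.0.1', 12345, $errno, $errstr, 2);
if ($conn === false) {
    // nohup + & so the listener survives after this web request ends
    exec('nohup php listener.php > /dev/null 2>&1 &');
} else {
    fclose($conn); // a listener is already there, so just use it
}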
Then when other people join, they will see the script running on the server and use that rather than launching their own.
Are there problems with this idea?
If the first person quits, will the script close down?
Would it be better to let the 'first player' launch the script and put a requirement on the script that it does not shut down? (I.e. the script will run forever until the server gets a restart. My server doesn't mind me running long-running scripts, as long as they don't use too much processing, but it doesn't allow scripts to start up automatically when the server is relaunched.)
Would this run-forever script avoid any 'first-player' shutdown problem?
If two people both pass the 'is the script running?' check, but one of them launches the script milliseconds before the other, surely this will create two copies of the script on the server, causing a kind of echo effect when listening. How would I overcome this?
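One possible guard against that race (just a sketch; the lock file path is arbitrary) is an exclusive lock inside the listener itself, so a second copy simply bows out:
<?php
// At the top of the listener script: take an exclusive, non-blocking lock
// so only one copy can ever run, even if two players launch it at once.
$lock = fopen('/tmp/game-listener.lock', 'c');
if ($lock === false || !flock($lock, LOCK_EX | LOCK_NB)) {
    exit; // another instance already holds the lock
}
// ... the listening loop goes here; the lock is released when the process exits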
Jon
I have a php program that does extensive curl requests to scrape web pages. It could be up to a million requests. I need to completely stop the script from running. Even though I stopped it in my browser, it is still processing requests. How can I stop it permanently?
You are just killing the request; you will need to stop Apache to stop it for now. In the future, redesign it so that the process looks for a kill switch (like the presence of a file) and stops processing when it finds one. It sounds like you are jamming a long-running process into a PHP web request; why not run it as a normal system process directly?
Assuming you are running the typical LAMP stack, SSH into your machine, if necessary, and restart Apache.
If you are really going to perform long-running tasks with PHP, I suggest you consider using cron to run them, or implement a task queue of some sort. It's generally a really bad idea to have this sort of thing fired from a browser request.
Restart Apache. If you're using XAMPP, stop and start it from the control panel.
If not, on Windows, go to Task Manager and end the apache.exe process, then start it again.
Why the hell is everyone assuming you're running Apache? Restart your web server and it should be dandy. In the future, you could add a kill switch, for example:
while(!file_exists('stop.txt'))
Then just make that file when you're ready to stop ^.^ Or have a finite number of iterations before cutting off.
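Fleshed out a little (just a sketch; it assumes your URLs are already in an array $urls, and stop.txt is an arbitrary file name):
<?php
// Kill-switch loop: stop as soon as someone creates stop.txt next to the script.
while (!file_exists(__DIR__ . '/stop.txt')) {
    $url = array_shift($urls);
    if ($url === null) {
        break; // nothing left to scrape
    }
    // ... perform the curl request for $url here ...
}
// Create stop.txt (e.g. over FTP) whenever you want the run to end.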
I'm trying to build a web interface for some Python scripts. The thing is, I have to use PHP (not CGI), and some of the scripts I execute take quite some time to finish: 5-10 minutes. Is it possible for PHP to communicate with the scripts and display some sort of progress status? This should allow the user to keep using the page while the task runs, seeing some status in the meantime, or just a message when it's done.
Currently I'm using exec() and processing the output on completion. The server is running on a Windows machine, so pcntl_fork will not work.
LATER EDIT:
Using another PHP script to feed the main page information via AJAX doesn't seem to work, because the server kills it (it reaches max execution time, and I don't really want to increase that unless necessary).
I was thinking about socket-based communication, but I don't see how that is useful in my case (some hints, maybe?).
Thank you
You want inter-process communication. Sockets are the first thing that comes to mind: you'd set up a socket in PHP that listens for a connection (on the same machine), and a socket in Python that connects to it and sends its status.
Have a look at this socket programming overview from the Python documentation and the Python socket module's documentation (especially the examples at the end). I'm sure PHP has similar resources.
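On the PHP side, the listening end might look roughly like this (a minimal sketch: the port is arbitrary, there's no error handling beyond the basics, and it accepts a single connection):
<?php
// Listen on a local port; the Python script connects and sends its status.
$server = stream_socket_server('tcp://127.0.0.1:9000', $errno, $errstr);
if ($server === false) {
    die("Could not listen: $errstr ($errno)\n");
}
$conn = stream_socket_accept($server); // wait for the Python worker to connect
$status = fread($conn, 1024);          // e.g. "42% done"
fclose($conn);
fclose($server);
echo $status;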
Once you've got a more specific idea of what you want to build and need help, feel free to ask a new question on Stack Overflow (if it isn't already answered).
I think you would have to use a meta refresh, and maybe have the Python script write its status to a file and then have PHP read from it.
You could use AJAX as well to make it more dynamic.
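A sketch of that idea: the Python script rewrites a small status file as it goes, and a tiny PHP endpoint, polled from the page via AJAX (or hit by the meta refresh), just echoes it back. The file name here is an assumption:
<?php
// status.php - relays the latest status written by the Python script.
$file = __DIR__ . '/progress.txt';
echo file_exists($file) ? file_get_contents($file) : 'not started';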
Also, you probably shouldn't use exec()... that opens up a world of vulnerabilities.
You could use a queuing service like Gearman, with a client in PHP and a worker in Python or vice versa.
Someone has created an example setup here.
https://github.com/dbaltas/gearman-python-worker
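With the pecl gearman extension, kicking off the long task from PHP as a background job looks roughly like this (a sketch only; 'run_script' is whatever function name your Python worker registers, and the server address is the default gearmand):
<?php
// Submit the long-running task to Gearman without blocking the page.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$handle = $client->doBackground('run_script', json_encode(['arg' => 'value']));
echo "Submitted job: $handle\n";
// The Python worker registers 'run_script', does the work, and can report
// progress back through a shared status file or the job status API.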
Unfortunately, my friend, I do believe you'll need to use sockets as you requested. :( I have little experience working with them, but this Python tutorial on sockets/network programming may help you get the Python socket interaction you need. (Beau Martinez's links seem promising as well.)
You'd also need some PHP socket code, so it can request the status.
Continuing on that, my thought is that your Python script likely runs in a loop. Ergo, I'd put the "check for a status request" step at the beginning of each pass through that loop: it would reply with one status, and a later pass would reply with an increased status, and so on.
Good luck!
Edit: I think the file-writing recommendation from Thomas Schultz is probably the easiest to implement. The only downside is waiting for the file to be opened: you'll need to make sure your PHP and Python scripts don't hang or return failure without trying again.
I created a simple XMPP bot in PHP which connects to the Google Talk server. I basically modified the cli_longrun example. When I run the script in a browser, the bot comes online and stays online for a while, even after I close the tab the script was running in (as it is just an infinite loop listening for events on the stream). But after a while the bot goes offline.
The question is: how do I keep the bot online all the time? One way I can think of is to run a cron job that disconnects the earlier session and starts a new one. But is there a better approach?
Run it from a command line; as long as the script doesn't break, it'll stay running for as long as the prompt is open.
If this is on a shared host, most likely they have measures in place to prevent a script from running forever even if you have set_time_limit(0) -- so you might be out of luck.
You might also respawn the script with a crontab entry of "@reboot sleep 300; ./runbot.sh", if you are allowed cron access.
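If your host's cron allows it, a periodic entry that relaunches the bot only when it isn't already running also works (a sketch; the paths are placeholders and it assumes flock from util-linux is available):
*/5 * * * * flock -n /tmp/runbot.lock -c "php /home/me/bot.php" >> /home/me/bot.log 2>&1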
Run it from the command line and make sure your script doesn't end. Also call set_time_limit(0) to keep PHP from killing it.