I am really sorry for my poor English.
I have created a simple WebSocket game with voryx/ThruwayBundle for Symfony. The game uses RPCs registered on the server. Everything works fine, but when I leave it for about 20 minutes the RPCs are no longer available, and I have to restart the WebSocket server to make them available again.
I tried to register my RPCs as workers, and I can see them running, but they are still unavailable.
(Screenshot: WebSocket server process status)
The annotation I use to register the RPC is:
/**
 * @Register("games.snake.newplayer", serializerEnableMaxDepthChecks=true, worker="add-snake")
 */
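For context, in voryx/ThruwayBundle the annotation sits in the docblock of a public method on a registered service, roughly like this (a sketch; the class, method, and return value are placeholders, not my actual game code):

<?php
// src/AppBundle/Service/SnakeRpcService.php (placeholder class for illustration)
namespace AppBundle\Service;

use Voryx\ThruwayBundle\Annotation\Register;

class SnakeRpcService
{
    /**
     * @Register("games.snake.newplayer", serializerEnableMaxDepthChecks=true, worker="add-snake")
     */
    public function newPlayer($name)
    {
        // Create the player and return its initial state to the caller.
        return ['player' => $name, 'alive' => true];
    }
}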
I run the server with the command:
nohup php app/console thruway:process start &
You can see it on http://amusement.cloudapp.net/
I am using an Ubuntu 15.10 server created in Microsoft Azure, if that is any help.
I don't know what I can do to make those RPCs available at any time without restarting the WebSocket server. Should I set up some cron action to reset the WebSocket server if they stop responding, and if so, how can I do it?
Edit #1
The RPCs work great on my local machine (Ubuntu 14.04).
To prevent the RPCs from disappearing, I created a Symfony console command that pings them with some test data. Then I registered this command as a cron job to be executed every minute.
I couldn't find the source of the problem; however, this is an easy way to avoid it.
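A minimal sketch of that keep-alive command, assuming the Thruway PHP client that ships with the bundle and the Symfony 2.x console API (the command name, realm, router URL, and test arguments are placeholders):

<?php
// src/AppBundle/Command/PingRpcCommand.php (placeholder keep-alive command)
namespace AppBundle\Command;

use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
use Thruway\ClientSession;
use Thruway\Connection;

class PingRpcCommand extends Command
{
    protected function configure()
    {
        $this
            ->setName('app:rpc:ping')
            ->setDescription('Calls the game RPC with test data so it stays responsive');
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        // Connect to the local Thruway router (URL and realm are placeholders).
        $connection = new Connection(['realm' => 'realm1', 'url' => 'ws://127.0.0.1:8080']);

        $connection->on('open', function (ClientSession $session) use ($connection, $output) {
            // Call the registered RPC with dummy data and close the connection afterwards.
            $session->call('games.snake.newplayer', ['ping'])->then(
                function () use ($connection, $output) {
                    $output->writeln('RPC answered.');
                    $connection->close();
                },
                function () use ($connection, $output) {
                    $output->writeln('RPC call failed.');
                    $connection->close();
                }
            );
        });

        $connection->open();
    }
}

The cron entry then simply runs it every minute:

* * * * * php /path/to/project/app/console app:rpc:ping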
Related
I have a website in Symfony 3 which runs a script on a remote UNIX server.
I would like to show the progress (steps) of the script on the website, in real time or every X seconds, without the user needing to reload the page.
What solution should be considered?
Edit - About the script:
When a user clicks a button, Symfony runs a shell script that is present on the same server as my web application. The shell script connects via SSH to a remote server (as a user) and installs components on that remote server.
The shell script is quite long and complex.
There are 2 possible solutions:
You can set up an interval that checks every X milliseconds for any changes and applies them to your front end (a server-side sketch for this polling option follows below).
The other option is to use WebSockets, which I would recommend over interval checking. It maintains an active connection to the server, so updates are near instant, if not instant.
I would suggest checking out this bundle: https://github.com/GeniusesOfSymfony/WebSocketBundle
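For the first (polling) option, the front-end interval just needs a small endpoint to ask for progress. A minimal sketch of such an endpoint, assuming the shell script appends each completed step to a progress file you choose (the route, file path, and DONE marker are assumptions):

<?php
// src/AppBundle/Controller/ProgressController.php (placeholder polling endpoint)
namespace AppBundle\Controller;

use Sensio\Bundle\FrameworkExtraBundle\Configuration\Route;
use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\JsonResponse;

class ProgressController extends Controller
{
    /**
     * @Route("/script/progress", name="script_progress")
     */
    public function progressAction()
    {
        // The shell script is assumed to append one line per completed step to this file.
        $progressFile = '/var/tmp/install-progress.log';

        $steps = is_file($progressFile) ? file($progressFile, FILE_IGNORE_NEW_LINES) : [];

        return new JsonResponse([
            'finished' => !empty($steps) && end($steps) === 'DONE',
            'steps'    => $steps,
        ]);
    }
}

The page then polls this route with XMLHttpRequest (or fetch) on an interval and renders the returned steps.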
I don't actually understand how you are doing this part:
which runs a script on a remote server UNIX
but here is a solution you can use.
Have a Symfony command that listens to the remote service/server/app, etc.
Have a Symfony controller that streams the output of that console command and echoes the results in a Twig template as they arrive.
This way, users don't need to refresh the page. The wheel has already been invented for you here: Streaming console command output from symfony controller. You just need to update the command as per your requirements.
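A minimal sketch of the streaming controller from that approach, assuming the Symfony Process and HttpFoundation components and a hypothetical command name (app:remote:install):

<?php
// src/AppBundle/Controller/StreamController.php (placeholder streaming controller)
namespace AppBundle\Controller;

use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\StreamedResponse;
use Symfony\Component\Process\Process;

class StreamController extends Controller
{
    public function streamAction()
    {
        $response = new StreamedResponse(function () {
            // Run the console command (name and path are placeholders) and flush
            // each chunk of its output to the browser as it arrives.
            $process = new Process('php bin/console app:remote:install');
            $process->setTimeout(null);
            $process->run(function ($type, $buffer) {
                echo $buffer;
                flush();
            });
        });

        // Ask intermediate proxies not to buffer, so output appears incrementally.
        $response->headers->set('X-Accel-Buffering', 'no');

        return $response;
    }
}

The linked answer renders the chunks in a Twig template; the sketch above simply echoes them, which you can adapt as needed.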
Note: if you don't want the console command approach, here is a commandless version: Streaming symfony response from twig template with XMLHttpRequest.
I feel a little bit silly for asking this question, but I can't seem to find an answer on the internet for this problem. After searching for several hours, I figured out that on a Linux server you use Supervisor to run "php artisan queue:listen" (either with or without daemon) continuously on your website to handle jobs pushed to the queue. This is all well and good, but what if I want to do this on a Windows Azure web app? After searching around, the solutions I found were:
Make a cron job to run "php artisan queue:listen" every minute (or every X minutes); I really dislike this solution and want to avoid it, especially if the site gets more traffic;
Add a WebJob that runs "php artisan queue:listen" continuously (the problem here is that I don't know how to write the script for the WebJob...);
I want to ask you guys for help to know which of these is the correct solution, whether there is a better one, and, if the WebJob is the best option, how to write the script for it. Thanks in advance.
In short, Supervisor is a modern alternative to nohup (no hang up) with a few other bits and pieces tacked on. There are other tools that can keep a task running in the background (as a daemon), and the solution I use for Windows-based projects (very few, to be honest) is Forever, which I discovered via https://stackoverflow.com/a/18226392/5912664
C:\myprojectroot > forever -c php artisan queue:listen --queue=some_nice_queue --tries=3
How?
Install Node for Windows, then install Forever with npm:
C:\myprojectroot > npm install -g forever
If you're stuck getting Node running on Windows, I recommend the Windows package manager, Chocolatey:
https://chocolatey.org/packages?q=node
Be sure to check for any log files that Forever creates; I left one long enough that it consumed 30 GB of disk space!
For Azure, you can add a new WebJob to your web app and upload a .cmd file containing a command like this:
php %HOME%\site\wwwroot\artisan queue:work --daemon
and define it as a triggered WebJob with a CRON frequency of 0 * * * * * (once a minute).
That worked for me.
Best.
First of all, you cannot use a WebJob with Laravel on Azure. The Azure PHP Web App is hosted on Linux, and WebJobs do not work on Linux at this moment.
The best way to do cron jobs in Laravel on Azure is to create an Azure Logic App. You use the Recurrence trigger and then an HTTP action to send a POST request to your Laravel Web App. You use this periodic heartbeat to run whatever actions you need. Be sure to add authentication to your POST request.
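A minimal sketch of the receiving end in Laravel, assuming the Logic App sends a shared secret in a custom header (the route, header name, config key, and report:daily command are assumptions, not Azure or Laravel defaults):

<?php
// routes/api.php (placeholder heartbeat endpoint)
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Artisan;
use Illuminate\Support\Facades\Route;

Route::post('/heartbeat', function (Request $request) {
    // Reject requests that do not carry the shared secret configured for the app.
    $expected = config('services.heartbeat_token');
    if (!$expected || !hash_equals($expected, (string) $request->header('X-Heartbeat-Token'))) {
        abort(401);
    }

    // Do the periodic work here; keep it well under the PHP time limit.
    Artisan::call('report:daily');

    return response()->json(['status' => 'ok']);
});

If you place this route in routes/web.php instead, remember to exclude it from CSRF verification so the Logic App's POST is not rejected.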
The next problem you will have is that the POST will be synchronous, so the work you are doing cannot be extensive, or your HTTP request will time out or you will hit the time limit on PHP scripts (60 seconds).
The solution is not Laravel Jobs because here again you need something running in the background to process the queues.
The solution is also not PHP threads. The standard Azure PHP Web App does not support PHP Threads. You can of course build your own Web App and enable PHP threads, but this is really swimming upstream.
You simply have to live with synchronous logic. So the work you are doing with the heartbeat should take no more than about 60 seconds.
If you need more extensive processing then you really need to offload it to another place: another Web App, an Azure Function, etc.
But why not do that in the first place? The reason is cost and complexity. If you have something simple, like a daily report, you simply connect the report to the heartbeat and all the facilities for producing it are right there in Laravel. Separating the daily report into its own container would require setup, and the Web App it runs in would incur costs; not worth it, in my view, for something simple.
While reading Brain Socket's documentation, I came across the following:
Any changes to your laravel app / code while the ws server is running are not taken into account. You need to restart the ws server to see any of your changes.
I understand why that's needed, but since it is quite inconvenient to press Ctrl+C and run php artisan brainsocket:start every time, what would be a convenient way to restart the WebSocket server?
Is there a way to easily run a PHP application from the command line on Windows Azure?
I have a standard Web Application (on Azure) and I want to communicate using WebSockets.
So I need to have a WebSocket Server running all the time on Azure.
I use the Wrench project, which I need to run "all the time" to listen on some port and deal with WebSocket messages sent from JavaScript.
So again: how can I easily run a "persistent" PHP application on Azure?
Thank you in advance.
Sandrino's answer is fine, but I prefer ProgramEntryPoint for doing this sort of thing. The trouble with a background task is that (unless you build something on your own) nothing is monitoring it. Using ProgramEntryPoint, Windows Azure will monitor the process, and if it exits for any reason, the role instance will be restarted.
EDIT:
Sandrino points out that the PHP program isn't the only thing running. (There's also a website.) In that case, I'd recommend launching php.exe in Run() in WebRole.cs. Process.Start it and then do a .WaitForExit() on it. That way, if the process exits, the role itself will exit from Run(), causing the role instance to restart. See http://blog.smarx.com/posts/using-other-web-servers-on-windows-azure for an example.
In order to run your PHP script as a command line application you should use the PHP CLI (command line interface).
php.exe -f "yourWebSocketService.php" -- -arg1 -arg2 -arg3
Now, in order to run this in Windows Azure you'll need to define a startup task that runs this command. You'll see that the default task type is simple, which means that the startup of your role will block until the task finishes. But in your case, running the WebSocket server in PHP will be a blocking process, which is why you should change the type to background (this makes sure the instance continues starting up while your WebSocket server is running).
Here is a WebSockets service on Azure: Live XSockets.NET.
Have a look at http://live.xsockets.net for an easy way of getting started, but it depends on what you are about to do on the "server side". The service I mention can be used as a "message" dispatcher to notify "clients" of changes, etc. In other words, it is a way of boosting "regular" web apps.
Everyone, I have studied and implemented these tutorials by Ray:
http://www.raywenderlich.com/3443/apple-push-notification-services-tutorial-part-12
http://www.raywenderlich.com/3525/apple-push-notification-services-tutorial-part-2
I have implemented APNs on a local server; now I want to do it on a live server. My question is about the script "push.php", which we run on the local server using this:
/Applications/MAMP/bin/php5.2/bin/php push.php development
How can we run it on the live server in production mode? Do we have to ask the hosting provider (we are using HostGator services) to run this script for us, or, as Ray says:
"However, on your production server you should start the script as follows:
$ /Applications/MAMP/bin/php5.2/bin/php push.php production &
The “&” will detach the script from the shell and put it in the background."
Does that mean we will use the command line interface to run that script on the live server? I am a little confused, because on the server side we normally use cron jobs to execute scripts, but this "push.php" should never exit, so I am not sure what to do. Please guide me on this. Thanks in advance. Regards, Saad
Yes, the command line interface should be used to run the PHP script and keep it running in the background.
However, as you are on a shared hosting service, I doubt they will let you run PHP continuously.
You may want to try asking them if it is possible; if it is not, just edit the PHP script you cited so that, instead of opening the connection at the beginning and continuing to run, every time it is invoked it opens a connection to the Apple server, sends the message, closes the connection, and exits. Although this is not encouraged by Apple, it would allow your script to be invoked by the web server only when necessary (so no continuous running is required).
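A minimal sketch of that per-invocation variant, based on the legacy binary APNs interface used in those tutorials (certificate path, passphrase, device token, and message are placeholders; Apple has since retired this interface in favor of the HTTP/2 provider API, so treat it purely as an illustration of the open-send-close pattern):

<?php
// send_one_push.php (placeholder one-shot sender, invoked on demand instead of running forever)
$deviceToken = 'abcdef0123456789abcdef0123456789abcdef0123456789abcdef0123456789'; // 64-char hex token
$certificate = '/path/to/ck.pem';   // combined certificate/key, as in the tutorial
$passphrase  = 'secret';
$gateway     = 'ssl://gateway.push.apple.com:2195'; // gateway.sandbox.push.apple.com for development

$payload = json_encode(['aps' => ['alert' => 'Hello from the live server', 'sound' => 'default']]);

// Open, send, close -- no long-running process is kept around.
$ctx = stream_context_create();
stream_context_set_option($ctx, 'ssl', 'local_cert', $certificate);
stream_context_set_option($ctx, 'ssl', 'passphrase', $passphrase);

$fp = stream_socket_client($gateway, $errno, $errstr, 30, STREAM_CLIENT_CONNECT, $ctx);
if (!$fp) {
    exit("Failed to connect: $errno $errstr" . PHP_EOL);
}

// Simple notification format: command byte, token length + token, payload length + payload.
$frame = chr(0) . pack('n', 32) . pack('H*', $deviceToken) . pack('n', strlen($payload)) . $payload;
fwrite($fp, $frame, strlen($frame));
fclose($fp);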