I have a website in Symfony 3 which runs a script on a remote UNIX server.
I would like to show the progress (steps) of the script on the website in real time, or every X seconds, without the user needing to reload the page.
What solution should be considered?
Edit - About the script:
When a user clicks a button, Symfony runs a shell script that lives on the same server as my web application. The shell script connects over SSH to a remote server (as a given user) and installs components on that remote server.
The shell script is quite long and complex.
There are 2 possible solutions:
You can set up an interval that polls the server every X milliseconds for changes and applies them to your front end (see the sketch after this answer).
The other option is to use WebSockets, which I would recommend over interval polling. A WebSocket maintains an active connection to the server, so updates arrive near-instantly.
I would suggest checking out this bundle: https://github.com/GeniusesOfSymfony/WebSocketBundle
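For the polling option, here is a minimal sketch of a progress endpoint; the controller name, the progress file path, and the JSON shape are assumptions for illustration only, not part of the original setup:

    <?php
    // src/AppBundle/Controller/ProgressController.php (hypothetical)
    // The long-running shell script is assumed to append its current step to
    // /var/run/install_progress.log; this endpoint just exposes that file as JSON.

    namespace AppBundle\Controller;

    use Symfony\Bundle\FrameworkBundle\Controller\Controller;
    use Symfony\Component\HttpFoundation\JsonResponse;

    class ProgressController extends Controller
    {
        public function progressAction()
        {
            $file = '/var/run/install_progress.log'; // assumed location

            $lines = is_readable($file) ? file($file, FILE_IGNORE_NEW_LINES) : array();

            return new JsonResponse(array(
                'steps' => $lines,                          // everything logged so far
                'done'  => in_array('DONE', $lines, true),  // script writes DONE when finished
            ));
        }
    }

The front end would then call this route with setInterval (or fetch) every few seconds and update the page with the returned steps.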
I don't actually understand how you are doing this part:
"which runs a script on a remote UNIX server"
but here is a solution you can use.
Have a Symfony command that listens to the remote service/server/app, etc.
Have a Symfony controller that streams the response from the console command and echoes the results in a Twig template as they arrive.
This way, users don't need to refresh the page. The wheel has already been invented for you here: Streaming console command output from symfony controller. You just need to adapt the command to your requirements.
Note: if you don't want the console command approach, there is a command-less version: Streaming symfony response from twig template with XMLHttpRequest
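For illustration, a minimal sketch of such a streaming controller, assuming the Symfony Process component and a hypothetical console command named app:install-remote (not taken from the linked answers):

    <?php
    // Hypothetical sketch of a streaming controller; the command name
    // 'app:install-remote' and the response handling are assumptions.

    namespace AppBundle\Controller;

    use Symfony\Bundle\FrameworkBundle\Controller\Controller;
    use Symfony\Component\HttpFoundation\StreamedResponse;
    use Symfony\Component\Process\Process;

    class InstallController extends Controller
    {
        public function streamAction()
        {
            $response = new StreamedResponse(function () {
                // Run the console command (which in turn runs the shell script over SSH)
                $process = new Process('php bin/console app:install-remote');
                $process->setTimeout(null);

                // Forward each chunk of output to the browser as soon as it arrives
                $process->run(function ($type, $buffer) {
                    echo $buffer;
                    flush();
                });
            });

            $response->headers->set('Content-Type', 'text/plain');

            return $response;
        }
    }

The Twig side would then read this response incrementally with XMLHttpRequest and append each chunk to the page, as the linked answers describe.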
I want to set up a WebSocket server using PHP. There are many ways to do this, but I wanted to ask people with experience which one is the most reliable (robust, lightweight, and fault-free). I have also written some code that creates a socket server, but I'm not sure how to start it. Do I need to open the page in a browser?
Usually a WebSocket server, like many servers, is started as a daemon in the background and listens there for incoming events. On Linux (Ubuntu) you might create a unit file to be consumed by systemctl; this way the daemon is started with every boot of the system and you can start and stop it as needed.
You can also start it from a command shell like this:
nohup php websocketserver &
This will send your server into the background, where it stays until the system is rebooted. Any output is logged to a file named nohup.out.
On Windows you are better off creating a service, or a scheduled task, to have your server start at system boot.
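For context, a minimal sketch of what such a long-running PHP socket server script might look like; the file name, port, and protocol handling are assumptions, and a real WebSocket server additionally needs the HTTP upgrade handshake (libraries such as Ratchet handle that for you):

    <?php
    // websocketserver.php (hypothetical) - a bare TCP accept loop run as a daemon.
    // A real WebSocket server must also perform the HTTP upgrade handshake and
    // frame/unframe messages; this only shows the long-running daemon structure.

    $server = stream_socket_server('tcp://0.0.0.0:8080', $errno, $errstr);
    if ($server === false) {
        fwrite(STDERR, "Could not bind: $errstr ($errno)\n");
        exit(1);
    }

    while (true) {
        // Block until a client connects, then read whatever it sends
        $client = @stream_socket_accept($server);
        if ($client === false) {
            continue;
        }

        $data = fread($client, 1024);
        fwrite(STDOUT, 'Received: ' . $data . "\n"); // ends up in nohup.out
        fclose($client);
    }

You would point the systemd unit or the nohup command above at a script like this.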
I have a setup where several application servers run the php-fpm service and all share a GlusterFS mount for the application code and other assets. In the current deploy process, the files are updated directly on the file server and, to reflect the changes, the application service often has to be reloaded. To achieve that, the deployment script needs to get into every server and issue a reload command, but with autoscaling the number of servers is not the same at every moment.
Overall, I am sketching a couple of alternatives to solve this problem:
The first one, more artisanal and not perfect, as a proof of concept, would be a cron job that runs every X minutes on the application machines and looks for a file that should contain a unique piece of information such as the machine's hostname or IP address. If it matches, the job takes no action; if not, it reloads the service and writes its own identifier into the file. During deployment, the script would clear the file, so all servers would reload on their next cron run (see the sketch after these two options).
The second, more sophisticated approach would use a message queue or notification service that the running application machines subscribe to at boot time, waiting for an order to reload. The deploy script would then publish a notification to make all servers aware that it is time. A cron job similar to the one in the previous method would then notice it and reload the app server.
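A purely illustrative sketch of the first option; the marker file path and the reload command are assumptions:

    <?php
    // check_reload.php (hypothetical) - run from cron on every app server.
    // The marker file lives on the shared GlusterFS mount and is cleared by the deploy
    // script; each server appends its hostname after reloading, so it reloads only once
    // per deploy.

    $marker = '/shared/deploy/reloaded_hosts'; // assumed path on the shared mount
    $me     = gethostname();

    $hosts = is_readable($marker) ? file($marker, FILE_IGNORE_NEW_LINES) : array();

    if (!in_array($me, $hosts, true)) {
        // Our hostname is not in the file: this deploy has not been applied here yet.
        exec('sudo systemctl reload php-fpm', $output, $status); // assumed reload command

        if ($status === 0) {
            file_put_contents($marker, $me . PHP_EOL, FILE_APPEND);
        }
    }

The crontab entry on each machine would simply run this script every X minutes.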
Would any of that make sense? Is there a simpler or more standard way to broadcast to the application servers running at a given moment during the deploy procedure, without having to SSH into each one and issue the reload command? Any other advice or suggestions you can provide?
Thanks!
I have tried reading up on this and understand the PHP console to be a command-line interface (CLI), like the one used by Composer. I do not understand the difference between a web script and a console script, and I do not see the point of having the two.
I want to crawl data from a certain link. Should I use a console script or a web script, and why?
Please explain in the simplest manner possible.
There is no difference between the two. In most instances, the same PHP script will run whether you execute it from the command line or via the web.
There is, however, a difference in the environment the script executes within. A CLI script is initiated from, and executed within, the shell on your computer; it is very self-contained. A web script, on the other hand, is (typically) initiated via an HTTP request from a browser, travels over the web to a web server, is executed on that remote server, and the result (typically a web page) is passed back to your browser. In the latter case, special environment variables related to the web request are made available to the script.
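To see this difference yourself, here is a small sketch (not from the original answer) that reports how it was invoked and which request variables it can see:

    <?php
    // context.php - run it both ways: `php context.php` and via a web server URL.

    if (php_sapi_name() === 'cli') {
        echo "Running from the command line\n";
        echo "Arguments: " . implode(' ', $argv) . "\n";
    } else {
        echo "Running through a web server<br>";
        echo "Requested URI: " . (isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : 'n/a') . "<br>";
        echo "Client address: " . (isset($_SERVER['REMOTE_ADDR']) ? $_SERVER['REMOTE_ADDR'] : 'n/a');
    }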
It's a bit hard to know which is the best case for your web crawler script without knowing more detail. But I'd say a command line script is what you're after.
One difference between a web page and a CLI instance is the way the script is executed: web pages are loaded via a web container (the web server), while CLI scripts are executed by the shell that launches PHP. Because of this, a CLI script does not have access to all the $_SERVER variables a web page has, since there is practically no HTTP request involved.
CLI scripts are useful for background tasks that are not initiated by the web server, for example a cron job that periodically cleans your database, or one that executes queued jobs. Think of CLI scripts as shell scripts: you can write a PHP script instead of a Bash one.
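As a small illustration of such a background task; the database credentials, table name, and schedule are placeholders:

    <?php
    // cleanup.php (hypothetical) - delete stale sessions; meant to be run by cron, e.g.:
    //   0 3 * * * /usr/bin/php /var/www/app/cleanup.php
    // The PDO credentials and the table name are placeholders.

    $pdo = new PDO('mysql:host=localhost;dbname=app', 'app_user', 'secret');

    $deleted = $pdo->exec(
        "DELETE FROM sessions WHERE last_seen < NOW() - INTERVAL 30 DAY"
    );

    echo "Removed $deleted stale session rows\n";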
The PHP interpreter is the same in both cases, and it's up to you to decide which one best suits your needs: web pages are more common, but if you need your server to do some work without waiting for a web request, then you can go with the CLI.
Well, basically a console script is the way to go for your task.
The difference lies in the fact that a web script will block your browser, will not show your progress in real time, and so on.
I was able to crawl and download about 6000 images from my beloved anime with a console script, showing the progress status as it went; that is harder with a web script, as the browser will buffer the output. You can also chain your scripts together and do some cron magic (assuming you are on a *nix box).
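For illustration, a rough sketch of how a console crawler can print progress as it goes; the URL list file and the output directory are placeholders:

    <?php
    // crawl.php (hypothetical) - download a list of images and report progress on STDOUT.

    $urls  = file('image_urls.txt', FILE_IGNORE_NEW_LINES); // one URL per line, placeholder file
    $total = count($urls);

    if (!is_dir('images')) {
        mkdir('images');
    }

    foreach ($urls as $i => $url) {
        $data = @file_get_contents($url);
        if ($data !== false) {
            file_put_contents('images/' . basename($url), $data);
        }

        // \r rewrites the same console line, something a browser page cannot do for you
        printf("\rDownloaded %d/%d (%.1f%%)", $i + 1, $total, ($i + 1) / $total * 100);
    }

    echo "\nDone\n";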
Is there an easy way to run a PHP application as a command-line process on Windows Azure?
I have a standard Web Application (on Azure) and I want to communicate using WebSockets.
So I need to have a WebSocket Server running all the time on Azure.
I use the Wrench project, which I need to keep running "all the time" to listen on a port and handle the WebSocket messages sent from JavaScript.
So again: how can I easily run a "persistent" PHP application on Azure?
Thank you in advance.
Sandrino's answer is fine, but I prefer ProgramEntryPoint for doing this sort of thing. The trouble with a background task is that (unless you build something on your own) nothing is monitoring it. Using ProgramEntryPoint, Windows Azure will monitor the process, and if it exits for any reason, the role instance will be restarted.
EDIT:
Sandrino points out that the PHP program isn't the only thing running. (There's also a website.) In that case, I'd recommend launching php.exe in Run() in WebRole.cs. Process.Start it and then do a .WaitForExit() on it. That way, if the process exits, the role itself will exit from Run(), causing the role instance to restart. See http://blog.smarx.com/posts/using-other-web-servers-on-windows-azure for an example.
In order to run your PHP script as a command line application you should use the PHP CLI (command line interface).
php.exe -f "yourWebSocketService.php" -- -arg1 -arg2 -arg3
Now, in order to run this on Windows Azure you'll need to define a startup task that runs this command. Note that the default task type is simple, which means the startup of your role will block until the task finishes. In your case, running the WebSocket server in PHP is a blocking process, so you should change the type to background (this makes sure the instance continues starting up while your WebSocket server is running).
Here is a WebSockets service on Azure: Live XSockets.NET.
Have a look at http://live.xsockets.net for an easy way of getting started, but it depends on what you are about to do on the server side. The service I mention can be used as a message dispatcher, to notify clients about changes, etc. In other words, it is a way of boosting regular web apps.
I have a function that imports data from Excel into a database. I made this function run on the server so that it no longer needs to interact with the client: the client's web browser just uploads the Excel file to the server, and after that the task runs entirely on the server, so even if the client closes the browser the function keeps running. I have got that part working. The problem is that when the browser is left open, it keeps loading for as long as the function is active. How can I make the browser not wait for the response from the server, so that it does not keep loading while the process runs on the server? Please help me.
Use a message queue to offload the task of processing the file from the web server to another daemon running separately.
You can take the cheap and easy route of exec'ing a process with & on the command line, causing it to be backgrounded. However, that gives you little control or status information.
The right way to go about it, IMO, is to queue up these long-running tasks in a database, with some status info associated with them. Then have a dedicated process, running separately from your web server, that checks the database for tasks, performs them, and updates the database with success/failure status.
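A minimal sketch of that pattern, assuming a hypothetical jobs table with id, payload, and status columns; the credentials and names are placeholders:

    <?php
    // worker.php (hypothetical) - run separately from the web server (e.g. under nohup,
    // supervisor, or cron). Table and column names are placeholders.

    $pdo = new PDO('mysql:host=localhost;dbname=app', 'app_user', 'secret');

    while (true) {
        // Grab one pending job; the web request only INSERTs a row and returns immediately.
        $job = $pdo->query(
            "SELECT id, payload FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1"
        )->fetch(PDO::FETCH_ASSOC);

        if ($job === false) {
            sleep(5); // nothing to do, poll again shortly
            continue;
        }

        $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")
            ->execute(array($job['id']));

        try {
            // ... run the existing Excel import for $job['payload'] here ...
            $status = 'done';
        } catch (Exception $e) {
            $status = 'failed';
        }

        $pdo->prepare("UPDATE jobs SET status = ? WHERE id = ?")
            ->execute(array($status, $job['id']));
    }

The web controller then only inserts a 'pending' row and returns at once, so the browser stops loading immediately.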
Look into using a queue such as Mseven's Queue Plugin.
Or, if you want a more daemon-based approach, look into Beanstalkd. The queue plugin by Mseven is pretty self-explanatory, though. Stay away from forking processes using &; it can get out of control.