I have a server.php file in my Elastic Beanstalk website running on an EC2 instance. It creates a WebSocket server and keeps it alive with an infinite loop (it takes messages and sends them to the correct client).
But after deployment, server.php never starts running until I open it in my browser, and I'm not sure whether it keeps running after that.
I don't know the right way to do this. If this is the correct approach, how can I get server.php to start after deployment and keep running permanently?
Use supervisord. It's commonly used with Laravel (PHP) to keep queue workers running. It's quite convenient and has nice features that can be enabled, such as detecting whether a script failed to start, retries, automatic restarts, delayed starts and some more quality-of-life options.
There are some tutorials available: link and link.
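As a rough sketch (the program name, script path, user and log path below are placeholders, not taken from your setup), a supervisord program definition for a long-running PHP script could look something like this:

[program:websocket-server]
command=php /var/app/current/server.php
autostart=true
autorestart=true
startretries=3
startsecs=5
user=webapp
redirect_stderr=true
stdout_logfile=/var/log/websocket-server.log

With autostart and autorestart enabled, supervisord launches the script when it starts up and brings it back if it crashes, which is exactly the "keep it running always" behaviour you are after.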
My Scenario
I'm programmatically creating servers, and on server creation I'm firing a command. However, it seems like the command fires too early, while the server isn't completely set up, so the command either fails or barely fires in time for server completion.
My Question
I want to know if it's safe to put something like sleep(120) at the start of my script, so the server has time to finish setting up before the command runs.
I only want this command to run once ever, which is why I'm avoiding a database or any other more advanced methods.
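To illustrate, this is roughly the kind of script I mean (the command name and lock-file path are made-up placeholders; the lock file is just one simple way to enforce the run-once requirement without a database):

<?php
// Run-once guard: if the marker file already exists, do nothing.
$lockFile = __DIR__ . '/provisioning.done';
if (file_exists($lockFile)) {
    exit;
}

// Give the freshly created server time to finish setting up.
sleep(120);

// Placeholder for the real command that should only ever run once.
exec('my-provisioning-command');

// Leave a marker so the command never runs again.
touch($lockFile);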
My script won't time out, because from what I've read, scripts run from the terminal don't time out unless a limit is explicitly set.
I'm not the most knowledgeable about server-side logic / best practices, so it would be great to know any ins and outs.
Thanks in advance.
I have a weird issue that I've been stuck on for a couple of days now. I'm trying to generate a PDF in a Laravel app using Chrome headless with this command:
google-chrome --headless --disable-gpu --print-to-pdf=outputfile.pdf http://localurl/pdf-html
The command basically opens Chrome in headless mode, navigates to the given URL and prints it as a PDF, saving the file in the specified location. The command works perfectly when run in my system's shell (I'm using Ubuntu 18.04). My issue arises when trying to run the same command from a Laravel controller. I've tried exec, shell_exec, system and passthru, and they all give me the same problem: if I run the command without redirecting output and sending the process to the background (that is, without appending >> tmpfile 2>&1 & to the end of the command), the request hangs. Running the command in the background would not normally be a problem, except that I need the command to finish in order to send the file back to the client as a download. Running it in the background executes it asynchronously, and I have no way of knowing when the process ends (or of waiting until it ends) so that I can send the file as a download in the response.
I've tried other alternatives to no avail. I've tried Symfony's Process component, which comes bundled with Laravel, and it also fails. I've tried puppeteer: instead of running the google-chrome command, I used a Node.js script with code from the puppeteer documentation (which, by the way, also works when run directly from my system shell), but when run from Laravel it throws a Navigation Timeout Error.
Finally, I created a simple PHP file with the following code:
<?php
$chromeBinary = 'google-chrome';
$pdfRenderUrl = "http://localhost:8000/pdf-html";
$fileName = 'invoice.pdf';
$outputDirectory = "/path/to/my/file/" . $fileName;
$command = sprintf(
'%s --headless --disable-gpu --print-to-pdf=%s %s',
escapeshellarg($chromeBinary),
escapeshellarg($outputDirectory),
escapeshellarg($pdfRenderUrl)
);
exec($command); // exec() blocks until Chrome has finished writing the PDF
echo file_exists($outputDirectory) ? 'TRUE' : 'FALSE';
?>
And the code runs just fine when run from the shell with php thefile.php, printing TRUE, meaning the command in exec was launched and the file existed once it finished. And THAT is the exact code I'm using in Laravel, except it only works, as mentioned above, when I send the process to the background.
Can anybody throw me a line here, please? Thanks
EDIT: @namoshek thanks for the quick reply, and sorry if I did not make myself clear. The problem is not long waiting times; perhaps I could live with that. The problem is that exec never finishes and I eventually have to forcefully terminate the process (neither exec nor any other alternative works: they all freeze the request completely, forever, with the exception of Process, which fails by throwing a TimeoutException). I'm using Postman to query the endpoint. The frontend is an Angular app, meaning the request for the invoice download will eventually be made asynchronously. Furthermore, the task itself is not a long-running task; as a matter of fact it finishes pretty quickly. Using a polling strategy or a notification system does not seem like a viable solution to me. Imagine an app with a download button for a simple document, where you have to click the button and then wait for the app to notify you via email (or some other way) that the document is ready. I could understand it if it were a more complicated process, but a document download seems like something trivial. What has me at a loss is why running the task from a plain PHP script works as I want it to (synchronously), yet I can't replicate that behaviour in the Laravel controller.
EDIT: I've also tried using Browsershot, which, by the way, also fails. Browsershot provides a way to interact with puppeteer behind the scenes (using Process) and generate a PDF file. And even though it's an external program, the behaviour I'm getting still doesn't seem normal to me: I should be able to obtain the download even if the request took 10 seconds to finish, because the external program is executed synchronously. But in my case it fails with a timeout error.
EDIT: After a while I came upon the apparent reason for the server hanging. The problem is that I was using artisan's development server. Initially this did not seem like a problem to me, but it turns out that server can't handle this load. In the feature I'm implementing, I make a request to a particular endpoint, let's call it endpoint 1, to generate the PDF. The code on this endpoint triggers the external command, and when executed synchronously the code in endpoint 1 waits for the external command to finish. The external command in turn needs to browse to endpoint 2 on the same server; endpoint 2 contains an HTML view with the content to be put in the PDF. Since the server is still waiting on endpoint 1 for the external command to return, endpoint 2 is unresponsive, which apparently creates a deadlock that artisan's development server can't handle. The problem is that I did a quick search and found nothing that pointed to this deficiency in artisan's development server. I moved the environment to Apache just to test my theory and it worked, though it should be noted that the request takes a very long time to finish (around 10-20 seconds). So far this seems like the only reasonable explanation for the issue. If anyone knows how I can improve performance on this request, or can provide a better explanation of the original issue, I'd appreciate it.
@hrivera I'm a bit late to the game here, but regarding your last edit I believe you're almost correct. My take is that PHP's built-in server, which Laravel uses for development, is single-threaded. The issue I had was that any assets within the page being passed to Chrome (CSS, JS, etc.) couldn't be loaded, as the only thread was already in use, and so it hung. Removing the assets from the HTML fixed the issue.
Production servers are multi-threaded, so we should have no issues there. Not entirely sure I'm right, but I wanted to comment anyway.
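For what it's worth: if you still need to test this against the built-in server, PHP 7.4 and later can, as far as I know, start it with several worker processes via the PHP_CLI_SERVER_WORKERS environment variable, which avoids exactly this kind of deadlock. Something along these lines, with the port and docroot only as examples:

PHP_CLI_SERVER_WORKERS=4 php -S localhost:8000 -t public

This is only a development convenience, not a substitute for a real web server.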
I don't really get what you are asking for, because it seems you already understood that executing a long-running task like creating a snapshot will block the request if it is run synchronously. Using other software such as Puppeteer will not change that. If your request needs to wait for the result of this process, then the only way to have the request return faster is to speed up the task itself, which is most likely not possible.
So, there are basically only two options left: Live with the long wait times (if you want to perform the task synchronously) or execute the request/task asynchronously. The latter can be achieved in two ways:
Make the actual HTTP request run in the background (using Ajax) and use a loading indicator to keep your users patient. This way you could still run the process synchronously, but I would not recommend it, because you would have to use long timeouts for the Ajax request and in some situations the requests would probably still time out (depending on the workload of your server).
Use the power of Laravel and make use of queue workers to perform the work in the background (a minimal job sketch follows this list). When the snapshot generation is finished, you can then use one of the following three options to return the result:
Use polling on the client side to see if the result is available.
Send the result, or a link to it, to the user by mail or something similar.
Use a notification system to notify the user about the finished process and return the result in some way (e.g. fetch it, or send it as part of the notification; there are plenty of options available). A built-in notification system that does exactly what I described is Laravel Echo. On receiving the notification that tells you the process has finished, you could then fetch the result from the server.
These days, the standard for web apps and user experience is option 2 with the notification system (the third of the points above).
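A minimal sketch of such a queued job (the class name, URL and output path are invented for illustration and not taken from the question's code):

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class GenerateInvoicePdf implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    private $url;
    private $outputPath;

    public function __construct($url, $outputPath)
    {
        $this->url = $url;
        $this->outputPath = $outputPath;
    }

    public function handle()
    {
        // Same headless Chrome invocation as in the question, but run by a
        // queue worker instead of inside the web request.
        $command = sprintf(
            'google-chrome --headless --disable-gpu --print-to-pdf=%s %s',
            escapeshellarg($this->outputPath),
            escapeshellarg($this->url)
        );
        exec($command);

        // At this point the PDF exists (or generation failed); notify the user,
        // e.g. via a broadcast event / Laravel Echo, so the frontend can fetch it.
    }
}

Dispatching it from the controller is then a one-liner like GenerateInvoicePdf::dispatch($url, $outputPath); and a worker started with php artisan queue:work picks the job up.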
I'm currently working on a rather complex PHP 5 / Symfony 2.8 project. In this project I have a command that can be called by a crontab, manually in the console, or via a button on the website. It calls a web service on an external site, which returns an XML file that I import into my database using SimpleXML.
The command works like a charm on my local dev environment, no matter how it is called.
But for unknown reasons (which is why I'm posting here), on my int or prod environments, which are located on external servers, calling the command from the button on the site isn't working.
My button triggers this action:
$process = new Process('php '.$kernelRootDir.'/console my:super:command');
$process->start();
As the command is rather heavy, I cannot afford to wait for it to complete, which is why I'm using $process->start() rather than run(). I don't need any logs from it, so it's fine for the controller to just start the command and let it run while the user moves on to another page.
And again, this works great on my local environment (Debian 7 VM), but not on the distant server (I'm not sure what it runs). However, if I manually launch the command from the console, or let the crontab call it, it runs perfectly to the end. It's only triggering it from my controller via Process that doesn't work.
This has been pulling my hair out since yesterday, but I can't figure out why the command isn't even starting on the prod environment. Any tips?
Update: I changed my command so that it only dumps a small "it worked" message, and used the wait() and getOutput() methods to get the result in my controller. On my local environment, I instantly got my message back in the controller using a dump/die combo. On the distant server, my command triggers a ProcessTimedOutException: the process exceeded the 60 seconds timeout.
So there really is a problem with Process being unable to launch a custom command, even though that command works when called manually from the console.
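(For reference, the 60 second limit in that exception is Symfony Process's default timeout; it can be raised or disabled, for example with $process->setTimeout(null) before starting the process, although that alone would not explain why the command never starts.)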
Okay, after struggling for two days, I found that the distant server might be responsible for this: it seems that if I don't wait for the process to answer, the process is killed instantly. I don't know why, I don't know how, but that's how it seems.
So I started searching for an alternative solution that doesn't use Process, and I found something. I can use PHP's exec() method to launch an asynchronous call like so:
exec('php '.$kernelRootDir.'/console my:super:command > /dev/null &');
// Apparently, using "&" launches the command asynchronously.
All my tests on the production server have been working so far, so I'll stick with that.
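(A commonly recommended variant, with the same placeholder console path, also redirects stderr so that the backgrounded process keeps no output stream attached to the PHP request:

exec('php '.$kernelRootDir.'/console my:super:command > /dev/null 2>&1 &');

The PHP manual notes that a program started through exec() will only keep running in the background if its output is redirected to a file or another output stream.)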
Whenever I make a simple Ajax POST request in Symfony, there seems to be a rather long wait before the response, both in the dev and prod environments.
I enabled XDebug to get a better insight. Here is what's going on on the server:
It seems to be Symfony's ClassLoader causing that long cycle. Page loads are similarly slow, generally over half a second.
I am running the application on localhost, on a Windows machine.
Is there any way I could reduce this execution time?