I'm currently using the Symfony Process component, which relies on PHP's proc_open function. I need to launch a wkhtmltopdf command of this form:
/usr/local/bin/wkhtmltopdf --window-status "___RENDER_PDF___" --orientation "portrait" --run-script "window.basilOptions = {storages: ['memory'] }; document.body.addEventListener('status:app:rendered', function () { window.status = '___RENDER_PDF___'; });" "http://localhost/p/lps#poll/lpsp002" "/tmp/pdf_d6fbWO"
When I run this command directly in my shell, it works just fine and takes about 6 seconds to execute. But when I launch it from PHP with the Process component, it times out, and when using the exec or proc_open functions directly, it runs indefinitely. The default timeout for the Process component is 60 seconds, and even extending it has no effect.
I tried this on PHP 5.4 and PHP 5.5, and the result is the same.
Any idea why this command runs just fine in the shell but not through PHP? Tested on MAMP and non-MAMP environments (on PHP 5.4 and 5.5). It does work on Ubuntu 14.04 with PHP 5.5, though.
I thought that maybe, when going through PHP and MAMP, the process completes but still hangs, as was reported with this bug? I'll update when I have more information, to confirm whether the PDF is actually generated or not.
Thanks.
The problem could be caused by session locking: the current PHP request holds a write lock on the session, so another request to the same server (the one being dumped into a PDF) waits in session_start() for the session to be unlocked. This creates a deadlock, because by default a session is opened with a write lock that blocks other requests.
To prevent this deadlock, close the session for writing by calling session_write_close(); before running the command. If you need to write to the session again, call session_start(); after the PDF has been rendered.
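A minimal sketch of that ordering, assuming the older Symfony Process API that accepts a command string (the wkhtmltopdf arguments are abbreviated here; the session calls around the command are the part that matters):

use Symfony\Component\Process\Process;

// Release the session write lock before launching wkhtmltopdf, so the request
// it makes back to this server is not stuck waiting in session_start().
session_write_close();

$process = new Process('/usr/local/bin/wkhtmltopdf --window-status "___RENDER_PDF___" "http://localhost/p/lps#poll/lpsp002" "/tmp/pdf_d6fbWO"');
$process->setTimeout(120); // generous; the shell run takes about 6 seconds
$process->run();

if (!$process->isSuccessful()) {
    // getErrorOutput() usually explains why wkhtmltopdf failed
    throw new RuntimeException($process->getErrorOutput());
}

// Re-open the session only if you still need to write to it afterwards.
session_start();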
Related
I have a Windows Server 2016 VPS with Plesk and PHP 7.1x.
I am trying to execute a simple AutoHotKey script from PHP using the following command:
<?php shell_exec('start /B "C:\Program Files\AutoHotkey\AutoHotkey.exe" C:\inetpub\vhosts\mydomain.com\App_Data\myahkscript.ahk'); ?>
This is the only line on the page. I have tried different ahk scripts, the current one simply creates a MsgBox.
When I request my PHP page, I see three processes created on the VPS in Task Manager under the expected user: cmd.exe, conhost.exe and php-cgi.exe. However, my PHP page just sits waiting and nothing actually happens on the server.
I have also tried the same line with exec instead of shell_exec; it makes no difference. I have tried both commands without start /B as well. In that case the PHP page completes, but no new processes are started.
I cannot find any errors in any logs: Mod_Security, Plesk Firewall, IIS.
Any ideas?
EDIT:
I tried my command from the VPS command prompt and was immediately slapped in the face with the obvious issue of the space in 'Program Files'. I quoted the string as shown above and the command works. This eliminated the hang when running from PHP; however, the command still does nothing when executed from the web page.
EDIT:
Based on suggestions from the referenced post 'debugging exec()':
var_dump: string(0) ""
$output: Array()
$return_val: 1
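For reference, a sketch of how those values would be collected with the usual 'debugging exec()' pattern, using the command from above (this is my reconstruction, not the exact code from that post):

// Capture the last output line, every output line and the exit code.
$cmd = 'start /B "C:\Program Files\AutoHotkey\AutoHotkey.exe" C:\inetpub\vhosts\mydomain.com\App_Data\myahkscript.ahk';
$lastLine = exec($cmd, $output, $return_val);

var_dump($lastLine);   // reported above: string(0) ""
print_r($output);      // reported above: Array ( )
var_dump($return_val); // reported above: 1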
One point was that I would probably not be able to invoke GUI applications. That puts a damper on the idea.
I'm experiencing a problem with a PHP request to the same server (to itself). In the example I will use file_get_contents(), but the same happens for exec('wkhtmltopdf [*SELF*]') or curl().
Let's name my server example.org:
apache2 installed
FastCGI (multiple PHP versions 5.3, 5.4, 5.5, 5.6, 7.0)
Now I have 2 dummy scripts:
1st script
//get-html.php
file_get_contents('http://example.org/index.html');
2nd script
//get-php.php
file_get_contents('http://example.org/index.php');
Testing
1) command line: php get-html.php // Success
2) browser: example.org/get-html.php // Success
3) command line: php get-php.php // Success
4) browser: example.org/get-php.php // Timeout
What I tried next
create a subdomain like subdomain.example.org/index.php to use a different PHP version for get-php.php than for index.php
amend /etc/hosts
request other sites (like google.com) // Success
session_write_close() before file_get_contents() and session_start() right after it does not work either
So my suspect is mod_fastcgi. It seems Apache is not able to run two instances of it to handle a PHP request that comes from the server itself, since running the script from the command line works as expected.
Does anybody have any advice?
I had not set PHP_FCGI_CHILDREN, which defaults to 1.
When a PHP script called another PHP script on my own server through Apache, the request failed because FastCGI could not spawn another PHP instance to serve it.
I'm using PhantomJS 64 bit in my PHP application to dynamically capture an HTML page to be emailed to the user.
phantomjs rasterize.js "http://..." /path_to_images/image.png
This works fine when I run the above on the command line, but when the PHP script runs the command using exec, it fails with no output and returns exit code 11.
If I switch to the 32-bit PhantomJS binary, the command succeeds but fails to load the Google JSAPI on the page, with the error ReferenceError: Can't find variable: google. This is a problem because not all of the page content is loaded and captured in the image. The JSAPI is included over HTTPS; if I switch to HTTP, the reference error is gone but the rendered image comes out all black.
I tested the command as the same user that PHP runs as.
To sum it up:
command> phantomjs_64 rasterize.js "http://..." /path_to_images/image.png
OK
exec('phantomjs_64 rasterize.js "http://..." /path_to_images/image.png');
No Output, Exit Code 11
command> phantomjs rasterize.js "http://..." /path_to_images/image.png
exec('phantomjs rasterize.js "http://..." /path_to_images/image.png');
Incomplete Output
Does anyone know why the default PhantomJS rasterize.js script would fail when run from PHP, or have a workaround for this?
UPDATE: This great article by Arlo Carreon points out how to make this work on HostGator shared hosting (which was my problem). Simply add 2>&1 at the end of the command to redirect stderr into the captured output. The 64-bit version still does not work, but this fixes the 32-bit version.
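In other words, a minimal sketch of the change (same rasterize.js arguments as above):

// 2>&1 folds stderr into stdout, so PhantomJS error text comes back to PHP
// instead of being silently dropped by the shared-hosting environment.
exec('phantomjs rasterize.js "http://..." /path_to_images/image.png 2>&1', $output, $code);
print_r($output); // rendered-page messages / error text, if any
var_dump($code);  // 0 on success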
It turns out that this only happens when the PHP script is requested through the Apache web server. The workaround is to create a database entry for each user who needs to receive the email and set up a cron job that executes the PHP script calling PhantomJS for each user entry in the DB. With the cron running at the smallest interval, the user perceives the email as being generated and sent immediately.
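A rough sketch of that cron-driven worker; the pending_emails table, its columns and the DSN are hypothetical placeholders, since the original schema isn't shown:

// Hypothetical cron entry: * * * * * php /path/to/render_pending.php
// Hypothetical schema: pending_emails(id, page_url, processed)
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

foreach ($pdo->query('SELECT id, page_url FROM pending_emails WHERE processed = 0') as $row) {
    $cmd = 'phantomjs rasterize.js ' . escapeshellarg($row['page_url'])
         . ' /path_to_images/image_' . (int) $row['id'] . '.png 2>&1';
    exec($cmd, $output, $code);

    if ($code === 0) {
        // ...email the captured image here, then mark the row as done...
        $pdo->prepare('UPDATE pending_emails SET processed = 1 WHERE id = ?')
            ->execute(array($row['id']));
    }
}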
I have a PHP website and I would like to execute a very long Python script in the background (300 MB of memory, about 100 seconds). The processes communicate via the database: when the Python script finishes its job, it updates a field in the database, and the website then renders some graphics based on the script's results.
I can execute the Python script "manually" from bash (from any current directory) and it works. I would like to integrate it into PHP, and I tried the shell_exec function:
shell_exec("python /full/path/to/my/script") but it's not working (I don't see any output)
Do you have any ideas or suggestions? It's worth mentioning that the Python script is a wrapper over other polyglot tools (Java mixed with C++).
Thanks!
shell_exec returns a string; if you call it on its own it won't produce any output, so you need to write:
$output = shell_exec(...);
print $output;
First off, set_time_limit(0); will let your script run forever, so the timeout shouldn't be an issue. Second, the *exec calls in PHP do NOT use the PATH by default (this might depend on configuration), so your script will exit without giving any information about the problem, and quite often it turns out it simply can't find the program, in this case python. So change it to:
shell_exec("/full/path/to/python /full/path/to/my/script");
If your Python script runs fine on its own, then this is very likely the problem. As for memory, PHP won't use the same memory Python is using: if Python uses 300 MB, PHP should stay at its usual footprint (say 1 MB) and just wait for shell_exec to return.
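Putting those two points together, a minimal diagnostic sketch (the interpreter and script paths are placeholders for your own):

// Don't let PHP's time limit kill the request while Python runs.
set_time_limit(0);

// Full path to the interpreter, and 2>&1 so Python's error messages come back
// in the returned string instead of an empty result.
$output = shell_exec('/usr/bin/python /full/path/to/my/script 2>&1');
print $output;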
A problem could be that your script takes longer than the request timeout defined for the server (which can be set in php.ini or httpd.conf).
Another issue could be that the account the server runs under does not have the rights to execute or access the code or files your script needs.
I found this a while ago and it helped me solve my background execution problem:
function background_exec($command)
{
    if (substr(php_uname(), 0, 7) == 'Windows')
    {
        // start launches the command in its own process; pclose(popen(...))
        // returns immediately without waiting for it to finish.
        pclose(popen('start "background_exec" ' . $command, 'r'));
    }
    else
    {
        // Discard output and background the process so exec() returns at once.
        exec($command . ' > /dev/null &');
    }
}
Source:
http://www.warpturn.com/execute-a-background-process-on-windows-and-linux-with-php/
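For example, combining this with the full-path advice above (paths are placeholders):

// Fire and forget: the page returns immediately while Python keeps running and
// reports completion through the database field mentioned in the question.
background_exec('/usr/bin/python /full/path/to/my/script');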
Thanks for your answers, but none of them worked :(. I decided to implement it in a dirty way, using busy waiting, instead of triggering an event when a record is inserted.
I wrote a background process that runs forever and on each iteration checks whether there is something new in the database. When it finds a record, it executes the script and everything is fine. The idea is that I launch this background process from the shell.
I found that the issue, when I ran into this, was simply that I had not compiled the source on the server I was running it on. If you compile on your local machine and then upload the binary to your server, it can end up broken in some way. shell_exec() should work if you compile the source you are trying to run on the same server your script runs on.
I have a PHP script that calls a .bat file using system(). The output is written to the screen, and I derive some values by parsing this output. This is running on a Windows 2003 IIS server with PHP v5.2.0.
Specifically, I am using this script to launch an Amazon EC2 instance and assign an IP address to it. It has worked great for me so far, but recently this problem started.
Here is the code
$resultBatTemp = system("cmd /C C:\Inetpub\ec2\my_batch_file_to_launch_instance.bat");
$resultBat = (string)$resultBatTemp;
$instanceId = substr($resultBat, 9, 10);
...
Once I have this instance ID, I can run another batch file that associates an IP address with the instance. It would appear that the instance does get launched, but I never get the output on the screen.
For some reason this has all stopped working: the page freezes and never refreshes. I also need to completely exit Safari or Mozilla, otherwise all pages from the website fail to load; only when I relaunch the browser can I view the website again. I've connected to the web server that hosts these scripts and checked the PHP error log, but nothing shows up there. I've opened a DOS prompt and entered the commands from the .bat file that way, and it connects to Amazon and launches the instance fine. I've also isolated this bit of code and removed the system() command, and the rest of the script runs fine, so it appears the hold-up is with capturing the output of the .bat file.
I recently purchased a new domain name for the site, so this script now runs from that domain. Might this cause the problem?
thanks
UPDATE:
Well, I hope this helps someone. I didn't find out what was wrong, but I created a new PHP file with a simple system() command that called a .bat file, and then a non-existent .bat file, expecting to get an error back, but got nothing, just the usual long hang. So I restarted IIS and that fixed the problem. I don't know what was wrong, but that did the trick.
Maybe first check what the system() call returns: according to the documentation it returns FALSE on failure. Also, including your my_batch_file_to_launch_instance.bat in the question might help in solving it.
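For instance, a quick sketch reusing the call from the question:

// system() echoes the command's output, returns its last line (or FALSE on
// failure) and fills $return_val with the exit code.
$resultBatTemp = system("cmd /C C:\Inetpub\ec2\my_batch_file_to_launch_instance.bat", $return_val);

if ($resultBatTemp === false) {
    die('system() could not run the batch file');
}

echo "last line: $resultBatTemp, exit code: $return_val";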
Try using the passthru function.
Also make sure that all your commands are safe: use escapeshellarg() or escapeshellcmd() to ensure that users cannot trick the system into executing arbitrary commands.
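For example, a sketch combining both suggestions, with the batch file path from the question:

// passthru() streams the command's raw output straight to the browser, which is
// handy for seeing exactly what the batch file printed; escapeshellarg() keeps
// the path from being mangled or abused.
$bat = 'C:\Inetpub\ec2\my_batch_file_to_launch_instance.bat';
passthru('cmd /C ' . escapeshellarg($bat), $return_val);
echo "exit code: $return_val";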