Running a SlimerJS + headless Firefox script through PHP on a webserver?

On my server (Ubuntu 14.04.4 LTS) I have Firefox installed, as well as xvfb for headless Firefox operation, and CasperJS with SlimerJS. I also have a CasperJS script which works fine. I want to call this script from PHP; this is the essence of my PHP script for that, let's call it mytest.php:
echo "php_sapi_name() " . php_sapi_name() . "\n"; // "cli" for php cli, "apache2handler" for php via webserver
chdir(dirname(__FILE__));
$nodeModPath = "/home/USERNAME/.nvm/versions/node/v4.0.0/lib/node_modules";
putenv("SLIMERJSLAUNCHER=/usr/bin/firefox46");
$cmdline = "xvfb-run $nodeModPath/casperjs/bin/casperjs --engine=slimerjs --debug=true mySlimerScript.js";
$returnString = shell_exec($cmdline);
echo "$returnString\n";
EDIT: Note that the command could as well be just:
$cmdline = "xvfb-run $nodeModPath/casperjs/bin/casperjs --engine=slimerjs --debug=true 2>&1";
... that is, without any JS script listed, in which case the usage/help text should be dumped (and it is, when run via the CLI; but the same error as below is reported when accessing through the webserver).
When I run this PHP script from the terminal command line (via SSH), that is through PHP in CLI mode:
$ php mytest.php
... everything runs fine, there is no problem whatsoever.
However, when I invoke this PHP script online through the webserver, that is via http://example.com/mytest.php, it fails first with the error:
Gecko error: it seems /usr/bin/firefox46 is not compatible with SlimerJS.
See Gecko version compatibility. If version is correct, launch slimerjs
with --debug=true to see Firefox error message
... and after adding --debug=true (as already included in the example above), I additionally get this error:
JavaScript error: resource://gre/modules/FileUtils.jsm, line 63: NS_ERROR_FAILURE: Component returned failure code: 0x80004005 (NS_ERROR_FAILURE) [nsIProperties.get]
So, apparently my headless Firefox does not want to run when PHP is invoked through the webserver (in which case PHP reports that it uses the apache2handler SAPI).
Would anyone know why this happens, and how I can get the script to execute properly when called from the webserver, just as it does under PHP CLI mode?
EDIT 2: I can now reconstruct this error via CLI mode too, and can confirm it is due to the user; so without any JS script provided in the $cmdline, I get this:
$ sudo -H -u root php mytest.php
...
Usage: casperjs [options] script.[js|coffee] [script argument [script argument ...]]
casperjs [options] test [test path [test path ...]]
casperjs [options] selftest
...
$ sudo -H -u www-data php mytest.php
JavaScript error: resource://gre/modules/FileUtils.jsm, line 63: NS_ERROR_FAILURE: Component returned failure code: 0x80004005 (NS_ERROR_FAILURE) [nsIProperties.get]
Gecko error: it seems /usr/bin/firefox46 is not compatible with SlimerJS.
See Gecko version compatibility. If version is correct, launch slimerjs
with --debug=true to see Firefox error message
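For reference, a quick way to see what the spawned shell actually inherits under each user (just a sketch; id is standard coreutils) is to add something like this to mytest.php right before the shell_exec() call:
// which user, and which $HOME, does the spawned shell see?
echo shell_exec('id; echo "HOME=$HOME"');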

Well, this was a nasty problem. I ended up doing an strace and comparing the logs for the root user and the www-data user when running the full slimerjs command (the full command line can be found by adding echo statements to /path/to/slimerjs-0.10.1-pre/slimerjs):
sudo -H -u www-data strace \
/usr/bin/firefox46 -app /path/to/slimerjs-0.10.1-pre/application.ini \
--profile /path/to/firefox-46.0.1/profile-46 -no-remote --debug=true /home/USERNAME/.nvm/versions/node/v4.0.0/lib/node_modules/casperjs/bin/bootstrap.js --casper-path=/home/USERNAME/.nvm/versions/node/v4.0.0/lib/node_modules/casperjs \
--cli 2>&1 \
| tee /tmp/strace.log
sudo -H -u root strace \
/usr/bin/firefox46 -app /path/to/slimerjs-0.10.1-pre/application.ini \
--profile /path/to/firefox-46.0.1/profile-46 -no-remote --debug=true /home/USERNAME/.nvm/versions/node/v4.0.0/lib/node_modules/casperjs/bin/bootstrap.js --casper-path=/home/USERNAME/.nvm/versions/node/v4.0.0/lib/node_modules/casperjs \
--cli 2>&1 \
| tee /tmp/straceR.log
If these logs are then compared in, say, meld, they eventually start diverging at a point like this:
mkdir("/root/.innophi", 0700) = 0
mkdir("/root/.innophi/slimerjs", 0700) = 0
... [vs.] ...
mkdir("/var/www/.innophi", 0700) = -1 EACCES (Permission denied)
access("/var/www/.innophi", F_OK) = -1 ENOENT (No such file or directory)
So, CasperJS/SlimerJS basically tries to create a directory in the user's home directory; the problem is that www-data's $HOME is /var/www, where it seemingly has no write access!
So the easiest thing for me was to "hack" the $HOME environment variable in the mytest.php script and set it to /tmp, where www-data definitely has write permissions:
...
putenv("SLIMERJSLAUNCHER=/usr/bin/firefox46");
putenv("HOME=/tmp");
...
... and whaddayaknow, finally the script works under the www-data user from CLI too:
$ sudo -H -u www-data php mytest.php
...
Options:
--verbose Prints log messages to the console
--log-level Sets logging level
--help Prints this help
...
Btw, this .innophi directory also seems to be mentioned in https://docs.slimerjs.org/current/configuration.html#profiles.
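For completeness, the relevant part of mytest.php ends up roughly like this (a sketch, with the same paths as in the question):
chdir(dirname(__FILE__));
$nodeModPath = "/home/USERNAME/.nvm/versions/node/v4.0.0/lib/node_modules";
putenv("SLIMERJSLAUNCHER=/usr/bin/firefox46");
putenv("HOME=/tmp"); // a writable $HOME for www-data, so the .innophi profile directory can be created
$cmdline = "xvfb-run $nodeModPath/casperjs/bin/casperjs --engine=slimerjs --debug=true mySlimerScript.js 2>&1";
echo shell_exec($cmdline);
Since shell_exec() runs the command through /bin/sh, the same effect could presumably also be achieved by prefixing the assignments on the command line itself (e.g. "HOME=/tmp xvfb-run ...") instead of calling putenv().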

Related

Execute qpdf in php shell_execute

I have installed Lampp-x64-5.6.3 on my OpenSuse 13.2 OS. I have built a program which requires the execution of qpdf, which I have installed from the OpenSuse repo itself.
When I run the commands (given below) I get no response and nothing works at all, whereas I am able to execute other binaries within the /usr/bin/ directory.
$execQuery = "/usr/bin/qpdf --decrypt --stream-data=uncompress --force-version=1.4 ".escapeshellarg('/opt/lampp/htdocs/test/test.pdf')." ". escapeshellarg('/opt/lampp/htdocs/test/temptest.pdf');
shell_exec($execQuery);
#OR
$execQuery = "/usr/bin/qpdf '/opt/lampp/htdocs/test/test.pdf' '/opt/lampp/htdocs/test/temptest.pdf'";
shell_exec($execQuery);
PHP safe_mode is off, shell_exec, exec, system etc are enabled. Still I am unable to run this particular binary (/usr/bin/qpdf).
I am getting output when I run commands like echo, ls -l, dir or even skype through PHP's shell_exec() function.
The permission for the file is: -rwxr-xr-x 1 root root 85248 Jun 18 10:31 /usr/bin/qpdf
However, I am able to execute the qpdf command via the OS terminal, and it creates the file perfectly.
The directory /opt/lampp/htdocs/test/ is writable by both qpdf and apache/lampp
I have tried almost all methods mentioned in various forums but still can't get this executable to process the file.
Thanks in advance.
UPDATE:
As suggested, I tried this:
$command = "/usr/bin/qpdf --decrypt --stream-data=uncompress --force-version=1.4 ".escapeshellarg('/opt/lampp/htdocs/test/test.pdf')." ". escapeshellarg('/opt/lampp/htdocs/test/temptest.pdf');
shell_exec($command. " > /opt/lampp/htdocs/debug.log 2>&1");
The errors are logged!
......
/opt/lampp/lib/libstdc++.so.6: version `GLIBCXX_3.4.9' not found
......
SOLUTION:
I simply had to rename (or delete) the bundled /opt/lampp/lib/libstdc++.so.6 file, so that qpdf picks up the system libstdc++ instead.
Run in a terminal:
sudo mv /opt/lampp/lib/libstdc++.so.6 /opt/lampp/lib/libstdc++.so.6___
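To verify which libstdc++ qpdf now resolves against under the web server's environment (lampp's environment points at its own lib directory, which is presumably why the bundled copy was picked up), a quick check can be run from PHP itself; a sketch, assuming ldd is installed:
// show which libstdc++ the dynamic loader picks for qpdf when launched from PHP
echo "<pre>" . shell_exec("ldd /usr/bin/qpdf 2>&1 | grep libstdc++") . "</pre>";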

Not able to run wkhtmltopdf command through shell_exec() function in PHP but same command works on command line

I'm in trouble and quite confused about a PHP shell_exec command.
When the command is executed by PHP I get no error, but the execution fails. If I use exactly the same command from a terminal, it works.
Here's the command:
/usr/bin/wkhtmltopdf --lowquality --dpi 300 --encoding utf-8 "/tmp/knplabs_snappyxa9otq.html" "/tmp/knplabs_snappyv3pD7h.pdf"
When I launch this from a terminal:
$ /usr/bin/wkhtmltopdf --lowquality --dpi 300 --encoding utf-8 "/tmp/knplabs_snappyWG9XTd.html" "/tmp/knplabs_snappyv3pD7h.pdf"
Loading page (1/2)
Printing pages (2/2)
Done
But from my PHP script:
// Construct the previous command
$command = $this->buildCommand($url, $path);
../..
shell_exec($command);
../..
$content = file_get_contents($path);
../..
I've tested the output of shell_exec; it's empty.
The log:
Warning: file_get_contents(/tmp/knplabs_snappyv3pD7h.pdf): failed to open stream: No such file or directory in /*****/lib/snappy/SnappyMedia.class.php on line 64
There is no permission problem in the /tmp directory:
$ ls -la /tmp
total 448
drwxrwxrwt 16 root root 4096 mars 12 21:51 .
../..
I've tried with the PHP exec() function to get error information; I just get a "1" error code in return_var and nothing in output.
For information, this issue appears on my test server and my desktop computer, but not on my notebook. All three have the same PHP, Apache and MySQL versions.
I don't understand anything ...
Thanks for any help, I'm losing my mind.
David.
I've found the solution here: Executing wkhtmltopdf from PHP fails
Thanks to Krzychu.
First, to get information from the shell_exec command, add " 2>&1" at the end of the command. That way you will get the command's output back:
$no_output = shell_exec($command);
echo $no_output; // nothing
$output = shell_exec($command . ' 2>&1');
echo $output; // in my case : "cannot connect to X server"
The solution:
Don't use the wkhtmltopdf Ubuntu package (0.9.9-4)
Use the official package from the wkhtmltopdf download page
So there is no need to install xvfb! (I've seen that advice many times.)
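If switching to the official build is not an option, the usual fallback is to wrap the distro binary in xvfb-run, as in the SlimerJS and webkit2png questions on this page; a sketch reusing the command from above:
$command = 'xvfb-run -a /usr/bin/wkhtmltopdf --lowquality --dpi 300 --encoding utf-8 "/tmp/knplabs_snappyxa9otq.html" "/tmp/knplabs_snappyv3pD7h.pdf"';
// -a lets xvfb-run pick a free display number; 2>&1 keeps the error output visible
echo shell_exec($command . ' 2>&1');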
Looks like a user's permissions issue.
When you run the command from the terminal, it is the currently logged-in user account which has the right permissions to run a command in /usr/bin and execute the specific file.
When you run it from the PHP script, it is the HTTP server account on your system which needs permission to execute the file in /usr/bin. Usually this is the apache user.
How you should set up permissions depends on your system. Just remember that whatever is allowed for apache is allowed for anyone accessing your HTTP server.
I have had this problem for ages and adding . ' 2>&1' after the $command has somehow solved the problem.
this:
$output = shell_exec($command . ' 2>&1');
instead of:
$output = shell_exec($command);
No idea why but it works and I'm grateful.
Is it shared hosting? It seems like shell_exec may be a restricted function. Try running error_reporting(E_ALL); ini_set('display_errors', 1); before calling shell_exec.
I stumbled upon the same problem; in my case an absolute path in the exec command like /var/www did not work, and I had to use relative paths from the location where I executed the PHP file.
I also want to note that it did not work using shell_exec, but it did work using the normal exec command; not sure what the difference is here.
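That difference is easier to pin down with exec(), which (unlike shell_exec()) also exposes the command's exit code; a sketch, reusing $command from the snippets above:
exec($command . ' 2>&1', $outputLines, $exitCode);
echo "exit code: $exitCode\n";            // 0 means success
echo implode("\n", $outputLines) . "\n";  // combined stdout + stderr, line by line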

How to access root packages through php?

Issue: Initially, through the command line as the root user, I accessed a package called pandoc (/root/.cabal/bin/pandoc) which was installed in the root folder. When I try to access that package through PHP using shell_exec(), it fails.
Question: Is there any limitation that prevents PHP's shell_exec() from accessing root packages for security purposes? If so, how do I resolve it?
I tried: I gave write permission to the root folder; then I could access root packages through the command line as a non-root user, yet I still couldn't access them through PHP's shell_exec().
PHP code:
shell_exec("cd /home/quotequadsco/public_html/pandoc_jats ; sudo -u quotequadsco
-S /root/.cabal/bin/pandoc ex.tex --filter /root/.cabal/bin/pandoc-citeproc
-t JATS.lua -o ex.xml");
and also tried,
shell_exec("cd /home/quotequadsco/public_html/pandoc_jats ;/root/.cabal/bin/pandoc
ex.tex --filter /root/.cabal/bin/pandoc-citeproc -t JATS.lua -o ex.xml");
Expectation: I need to execute the pandoc root package through shell_exec() in PHP.
I added the following lines to the /etc/sudoers file:
#Defaults requiretty    # commented out this line
usergroup ALL=(ALL) ALL
PHP code:
shell_exec("cd /home/quotequadsco/public_html/pandoc_jats; echo password | sudo -S command"); // pipe the password to sudo -S so the command runs as root
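Whatever the sudo setup ends up being, redirecting stderr makes the actual pandoc error visible from PHP (a sketch, reusing the paths from the question; under the web server account this will most likely show a permission error for /root/.cabal, which is the real obstacle):
$cmd = 'cd /home/quotequadsco/public_html/pandoc_jats && '
     . '/root/.cabal/bin/pandoc ex.tex --filter /root/.cabal/bin/pandoc-citeproc -t JATS.lua -o ex.xml 2>&1';
echo shell_exec($cmd);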
I recently published a project that allows PHP to obtain and interact with a real Bash shell (as root if requested); it solves the limitations of exec() and shell_exec(). Get it here: https://github.com/merlinthemagic/MTS
After downloading you would simply use the following code:
$shell = \MTS\Factories::getDevices()->getLocalHost()->getShell('bash', true);
$return1 = $shell->exeCmd('/root/.cabal/bin/pandoc ex.tex --filter /root/.cabal/bin/pandoc-citeproc -t JATS.lua -o ex.xml');
// the return value will be a string containing the output of the command
echo $return1;

shell_exec throws: sudo: no tty present and no askpass program specified

I have been trying to execute a script using shell_exec() function in php:
I've written the following lines of code:
$command = "bash /path/to/my/script/ funciton_name() 2>&1";
echo shell_exec($command);
Inside the shell script I'm doing:
sudo rsync -avvc /source/path /destination/path
On executing this on the browser, I get the following error message:
sudo: no tty present and no askpass program specified
When I execute the same shell script on my server, it executes fine.
When I went through similar questions posted on this forum, I realised that I had to add the NOPASSWD line on my server which I found out has already been added in the following format:
User_Alias NOBODY=nobody,apache
NOBODY ALL=(ALL) NOPASSWD : /path/to/my/script
Also when I do:
echo shell_exec("whois");
I get the output as:
apache
Any assistance in overcoming this problem would be of great help.
sudo will require a TTY, even if you have set it up to be passwordless, unless you explicitly tell it not to require one. But as @Cfreak pointed out, it would be much better (simpler and safer) to avoid sudo by setting correct access rights (read that before continuing) in the first place.
rsync itself will not require root permissions on a sanely configured *nix install. To verify this, you can check that type -a rsync doesn't print anything weird like rsync is aliased to `sudo rsync' and that ls -l $(which rsync) prints sensible permissions (at least rx for everyone).
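If rsync really does not need root for these paths, the quickest test is to drop sudo inside the script and look at the output directly; a sketch using the paths from the question:
// run rsync as the web server user and capture stderr so any failure is visible
echo shell_exec('rsync -avvc /source/path /destination/path 2>&1');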

Running webkit2png from a php script works in Terminal, but not in browser

I have the following PHP script:
<?php
$command = "python webkit2png/webkit2png -D screenshots http://stackoverflow.com";
$command = escapeshellcmd($command);
system($command);
When run from the Terminal by means of
php test.php
it produces the website screenshots; however, opening test.php in the browser does not produce any results.
Both the Python and PHP scripts are owned by the _www user, under which Apache is running. I even tried running test.php as _www in the Terminal, and it still works. Is there something I'm missing?
Thanks to @amccausl I found this in the Apache logs:
Wed Feb 27 07:12:03 mini.local python[83331] <Error>: kCGErrorFailure: Set a breakpoint # CGErrorBreakpoint() to catch errors as they are logged.
_RegisterApplication(), FAILED TO establish the default connection to the WindowServer, _CGSDefaultConnection() is NULL.
Traceback (most recent call last):
File "webkit2png/webkit2png", line 353, in <module>
if __name__ == '__main__' : main()
File "webkit2png/webkit2png", line 324, in main
AppKit.NSBorderlessWindowMask, 2, 0)
objc.error: NSInternalInconsistencyException - Error (1002) creating CGSWindow
The library you're using is trying to establish a connection to your X server to render a PNG. This works fine in a terminal, because you have a connection available, but it will break for SSH or Apache sessions because they don't have one.
You can create one for them with xvfb.
The approach used in this question is a good example for you (you can ignore the solutions).
<?php
$command = "xvfb-run -a -s '-screen 0 640x480x16' python webkit2png/webkit2png -D screenshots http://stackoverflow.com";
$command = escapeshellcmd($command);
system($command);
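system() prints stdout but its stderr typically ends up in the Apache error log (which is where the traceback above showed up), so to see errors directly in the browser it may help to capture everything with shell_exec() instead; a sketch reusing $command from above:
$output = shell_exec($command . ' 2>&1');              // capture stdout and stderr together
echo "<pre>" . htmlspecialchars($output) . "</pre>";   // show it in the HTTP response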
You should use the absolute path to the binary, e.g. /usr/local/php53/bin/php.
If you're not forced to use Python and can install something else, I'd recommend http://phantomjs.org/. It's much better and more powerful at making screenshots of webpages, and it doesn't need an X server.
