Shell_exec with git pull? - php

I am setting up a GitHub account to work on a small project with some friends.
I would like my home machine to be able to do a git pull via PHP, so that we just have to call this small PHP file for the machine to be up to date.
As of right now:
<?php
$output = shell_exec('git help');
echo "<pre>$output</pre>";
?>
This works perfectly and I get the output. I am in the right directory, so git pull should work just as well, but instead I get a hanging page: no error, nothing.
Any ideas?
EDIT: A few clarifications: the repo is pretty small (around 300 KB) and takes only a few seconds to pull from the command line. I also tried shell_exec("dir") and confirmed I am in the right directory. I am running the default installation of XAMPP on Windows 7 x64, in case that helps :)

I suggest exploring set_time_limit(), as well as making sure your git pull does not stop if the user disconnects, via ignore_user_abort(). Even running from a gigabit-connected server, some repositories just take a while to clone.
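As a minimal sketch, here is the pull script with both settings applied; the 2>&1 is an addition worth trying so that git's error output shows up in the page instead of the script hanging silently:

<?php
set_time_limit(0);        // remove PHP's execution time limit for slow pulls
ignore_user_abort(true);  // keep pulling even if the browser disconnects
// redirect stderr to stdout so git's error messages are captured too
$output = shell_exec('git pull 2>&1');
echo "<pre>$output</pre>";
?>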
Also, check PHP's working directory, and ensure the user running PHP has privileges to write to the repo. If you ran this via the CLI and it 'just works', there's a good chance that PHP was running without appropriate privileges when accessed via whatever web server you are using.
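A quick sketch for checking both from the browser (whoami is available on Windows 7 as well as most *nix systems):

<?php
echo '<pre>';
echo 'cwd:  ' . getcwd() . "\n";       // the directory PHP is actually in
echo 'user: ' . shell_exec('whoami');  // the account the web server runs as
echo '</pre>';
?>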
If you chmod the destination directory to 777 and it works, there's a very good chance that you need to recompile Apache/PHP for suEXEC support. Please don't just leave it at 777 if that is the case :)
Either way, timeouts and user aborts are still valid considerations, even after you get it working.

So, to answer my own question.
It was in fact a permission problem (thanks Tim); from the PHP CLI, the script was working.
The problem was that the service PHP installation uses some strange permissions.
So you/I need to start the PHP server via the command line (or, in this case, the XAMPP control panel).
Now it's working, giving me the "Already up-to-date." answer I was waiting for :)

Related

Laravel 5 Heroku Local doesn't work

I've successfully deployed the Laravel application to Heroku.
It works online.
But when I try to run "heroku local" I get:
vendor/bin/heroku-php-apache2: No such file or directory
Which makes sense, since looking into "vendor/bin", the only thing listed is:
psysh -> ../psy/psysh/bin/psysh
So, where's my heroku-php-apache2, or how do I fix this?
You should have these lines in your composer.json:
"require-dev": {
"heroku/heroku-buildpack-php": "*"
}
Be sure to run composer update after you add them.
After extensive research, trial and error, and talking to the Heroku support team, I found out that, although Slow Loris's answer was part of the process, the following answer was given to me by Heroku's support:
To cut a long story short, heroku local is not officially supported for PHP applications. The reason is that, unlike all the other languages we support on the platform, PHP has no web servers written in userland. Instead, we use PHP-FPM together with Apache or Nginx, and the boot scripts (vendor/bin/heroku-(php|hhvm)-(apache2|nginx)) dynamically inject the correct configuration for port binding and the FastCGI comms sockets.
This works with vanilla PHP and Apache builds, provided that:
1) the current user has all the correct permissions (in your case, /var/log/apache2/ isn't writable);
2) the correct proxy modules are loaded in the main httpd.conf;
3) the main httpd.conf doesn't bind to a port at all, or at least not to one under 1024 (which are reserved for superusers).
The main config also needs to be handled by each user on their own, because sometimes the modules to be loaded are in libexec/, sometimes in lib/apache2/modules/, and so forth. Just too many variations; otherwise, we could ship a full Apache config to users and the experience would be much better.
But the problems don't end there. FPM does not work at all on Windows, and on most Linux systems, httpd is not a command that works; instead, apache2ctl handles starting and stopping, and thus running a server in the foreground is not possible. In the end, there are simply too many possible permutations in system configs that make it impossible to ensure every user has a great experience.
It's simply the current reality in PHP land. Ruby, Python, Node, and Java all have web servers that are written in each respective language, so you don't need external servers. Which also makes it possible to stream file uploads, handle WebSocket upgrades, and so forth. Maybe with PHP 7 we'll see something like that emerge soon (in PHP 5 it's simply not feasible at all, because a fatal error kills the engine, so your web server would be gone too).
I know this question is a little dated, but I recently deployed a Heroku app for the first time and was unable to get heroku local to work for me. I'm on the current branch of Laravel, which is 5.8, and I am on Windows 10 using VS Code. I searched all over trying to rectify this issue and could not get it to work no matter what.
I did come up with a solution that lets me work on this locally with only a few lines in the terminal.
In VS Code, I used the Git Bash terminal. Once in my Heroku project folder, run composer require laravel/homestead --dev. Once that is complete, install Homestead with vendor/bin/homestead make, and then simply run vagrant up; your app will be accessible through localhost:8000. The commands are collected below.
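As a sketch, the full sequence from the project root (this assumes Vagrant and a provider such as VirtualBox are already installed):

composer require laravel/homestead --dev
vendor/bin/homestead make
vagrant up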
Docs - https://laravel.com/docs/5.8/homestead
Hope this helps someone!

Can't write log file in Linux using PHP

I'm running CentOS 6.5 on a Google Compute Engine instance which I use for an ejabberd XMPP server. I also have PHP 5 installed, and ejabberd is configured to use a PHP script to authenticate users.
So far so good - ejabberd executes the script and receives the correct result from it. The problem is: I want the PHP script to write a log file. So far I've tried:
Writing a file using file_put_contents to /var/log/mlog.log - this didn't work, so I tried manually creating the file and giving it chmod 777 (for testing). No result - the file remains empty. But when I execute the script manually using php from the terminal, the log is written.
Writing to syslog - I've configured php.ini to use syslog and then tried logging. Same result: nothing when ejabberd runs the script, but it works when I run it manually.
Configuring an error_log file and using error_log($message). Again, it didn't work.
I came to realize it must be something wrong with the write permissions of the ejabberd user (which runs the PHP scripts), but even when I set chmod 777 on every file in each of the options above, the log remains empty.
Any hints? What am I missing? (As you can probably tell, I don't have much knowledge of Linux, and this is the first time I'm using it in a project.)
This may not be the answer you are seeking. I am not very familiar with Linux. There is a logging PHP class known as KLogger. You can create logs using this class. It is very easy to use: you just download the PHP file and use it. You can find it on GitHub. Hope this might solve your problem. A usage sketch follows below.
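As a minimal sketch of what using KLogger looks like (this assumes the katzgrau/klogger package installed via Composer; the log directory and messages are made up for illustration, and the directory still has to be writable by whichever user runs the script - the same caveat as above):

<?php
require 'vendor/autoload.php';  // composer require katzgrau/klogger

// write log files into ./logs (must be writable by the PHP user, e.g. ejabberd)
$logger = new Katzgrau\KLogger\Logger(__DIR__ . '/logs');

$logger->info('auth script started');
$logger->error('auth failed', ['user' => 'example']);
?>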

Running continuous PHP (script) background processes on a WAMP server

I got the 140dev Twitter framework (which uses the Twitter Phirehose) running manually (via the web browser on my local WAMP server), but I can't figure out how to run both get_tweets.php and parse_tweets.php as background processes, like with SSH commands:
nohup php script.php > /dev/null &
Some of you have suggested (the Windows equivalent of) cron jobs, but this isn't the right way to go. I think this is because creating multiple connections (or reconnections) to the Twitter streaming Phirehose isn't allowed.
How can I run both PHP scripts (get_tweets.php and parse_tweets.php) as background processes on my local WAMP server (and later on a VPS)?
Just to clarify:
I am using a WAMP server (first to test a little bit and later to run it on a VPS)
Using LAMP or any *nix server/system isn't an option (due to time, experience, and lack of skills)
I have searched for solutions (on Google and Stack Overflow), but they are either not working or not clear enough for me (I am new to this)
Thank you in advance.
Find the php/bin folder where php.exe is located. Copy the folder path and add it to your PATH environment variable (follow this, for instance, to edit your PATH variable).
Once this is done, you'll be able to execute php from the command line anywhere. Just run php script.php from a command line in the right folder and it should work. There might be some configuration to do so that the command-line php uses WAMP's php.ini.
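A rough Windows equivalent of the question's nohup line, once php.exe is on the PATH, is cmd's start /B, which launches each script in the background of the current console. A sketch (the folder path is illustrative; note that, unlike nohup, the scripts stop if the console window is closed):

rem run from the folder that contains the scripts
cd C:\wamp\www\140dev
start /B php get_tweets.php
start /B php parse_tweets.php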

PhantomJS doesn't work in PHP through browser (but does via command line, and even by running PHP through command line)

I'm trying to get PhantomJS to run via PHP.
When I run the JavaScript file directly through the command line, it works fine. When I run php render_html.php in the command line, which just runs an exec(), it works fine. However, when I try opening this PHP file in the browser, it does nothing. I don't even get anything back from echo.
I've done all this testing locally on OS X and on my EC2 server, and I get the same result.
It might be a permission problem; check whether the user running the web server has permission to run the phantomjs executable.
(Posted on behalf of the OP).
It does seem to have been a permissions issue. It seems that Apache may be treated a bit differently than other users? Even though the standard permissions would allow any user to execute the PhantomJS application, Apache still was unable to.
Anyone with similar issues should read through this question and the answers.
Here is my slightly adapted solution for my specific case (on the Linux server). Edit your sudoers file (/etc/sudoers) to include the following line:
apache ALL=NOPASSWD: /usr/bin/phantomjs
I did this with Emacs (C-x C-q to enable editing of a read-only file). I didn't seem to have any issues with that, but using something called visudo is recommended for editing sudoers. Emacs worked for me, but you should look it up.
This gave the apache user explicit access to the PhantomJS app, by granting it sudo access limited to only the PhantomJS binary.
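With that sudoers entry in place, the PHP side calls PhantomJS through sudo. A sketch (render.js and the output handling are illustrative, not from the original post):

<?php
// invoke phantomjs via sudo, matching the sudoers rule above;
// 2>&1 makes phantomjs errors visible in the browser output
exec('sudo /usr/bin/phantomjs render.js 2>&1', $output, $status);
echo '<pre>' . htmlspecialchars(implode("\n", $output)) . '</pre>';
?>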
I recently worked with PhantomJS in a CentOS 7 environment and faced the same problem as the OP. I tried solutions and methods I found on Stack Overflow on top of each other, so I am not quite sure which method worked, or which methods worked together, to solve the problem. I tried:
giving 777 permissions to the phantomjs binary, the target JS file, and the containing directory (did not work)
visudo and giving Apache permission to use phantomjs with no password (did not work)
adding apache to the sys group (did not work)
disabling SELinux
After disabling SELinux and restarting CentOS, PhantomJS started working in the browser. :D
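For reference, a sketch of the usual way to do that last step on CentOS 7 (setenforce 0 switches to permissive mode only until the next reboot, and is a gentler first test than disabling SELinux outright):

setenforce 0   # temporary: permissive until reboot
# permanent: edit /etc/selinux/config, set SELINUX=disabled (or permissive), then reboot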

PHP -> python seems to not work on web server

I have a few pages (one HTML, one PHP, and one Python script) that take a user input and then output a PDF file (using ReportLab) for the user. When I test on Apache (localhost), it works perfectly.
However, on a real web host it is not working. The version of PHP on the web server is 5.2.17, and Python is installed.
Could the version be an issue? I am very lost, because I'm not very experienced and it works perfectly on Apache locally and not at all on the internet.
Last thing: the command I use to call the Python function from PHP is this:
$ed = exec("python pdfgeneration2.py $name $age");
I also thought maybe there is a better command for this?
My question may be vague and unclear, but if anyone has any ideas, it would be greatly appreciated.
Look at /var/log/httpd/error_log, or wherever your distro's Apache stores logs. Odds are you'll find an error message there from PHP.
In addition to checking the path as sberry recommended, double-check permissions too. The user Apache runs as will need access to the Python script, e.g.: chown apache:apache pdfgeneration2.py
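On the question's "maybe there is a better command" point: a minimal sketch of a safer invocation, escaping the user-supplied values and capturing stderr so failures surface instead of dying silently (the file and variable names come from the question; the error handling is illustrative):

<?php
$cmd = sprintf(
    'python pdfgeneration2.py %s %s 2>&1',
    escapeshellarg($name),  // never interpolate raw user input into a shell command
    escapeshellarg($age)
);
exec($cmd, $output, $status);
if ($status !== 0) {
    // surface the Python traceback in the web server's error log
    error_log('pdf generation failed: ' . implode("\n", $output));
}
?>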
