I'm using an FTP upload script to transfer files from one server to another. It works fine in a browser window, and works great from an SSH command line by running:
php /var/www/vhosts/domain.com/httpdocs/ftp_shell.php
but when I run the exact same command from a cron job, I get this:
PHP Warning: ftp_get(ftp.tmp): failed to open stream: Permission denied in /var/www/vhosts/domain.com/httpdocs/ftp_shell.php on line 516
PHP Warning: ftp_get(): Error opening ftp.tmp in /var/www/vhosts/domain.com/httpdocs/ftp_shell.php on line 516
I looked on line 516, which is:
if ((ftp_get($ftp_from, "ftp".$tmp.".tmp", $fname, FTP_BINARY) ||
     ftp_get($ftp_from, "ftp".$tmp.".tmp", $fname, FTP_ASCII)) &&
    (ftp_put($ftp_to, $fname, "ftp".$tmp.".tmp", FTP_BINARY) ||
     ftp_put($ftp_to, $fname, "ftp".$tmp.".tmp", FTP_ASCII)) &&
    ftp_site($ftp_to, "chmod ".base_convert($chmod, 10, 8)." $fname") &&
    f_size() && unlink("ftp".$tmp.".tmp")) {
    echoout(" (".$size_b." bytes) ok<br>");
    $count_f++;
}
I know that it's writing the file to a temp file, but why would it work in the browser and on the command line, but not from cron?
Probably the temp file was created by the PHP script (running as the apache user) and you are trying to read it from the cron job (running as the root user). For security reasons, temp files created by one user are not readable by others.
You have to change your cron job so that it is executed by the apache user, or you can alter the permissions of the temp file to make it readable by everyone, but this is not recommended, as any other process could then read the file.
Jobs run under cron have a different default working directory than what you get when running from a web server and/or the command line. Most likely whatever directory IS being used as the default under cron does not have appropriate permissions for the cron job to use.
Try using absolute paths, e.g.
ftp_get($ftp_from,"/real/path/to/file/ftp".$tmp.".tmp",$fname,FTP_BINARY)
^^^^^^^^^^^^^^^^^^^
instead of the relative paths you're using otherwise.
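For example, here is a minimal sketch using an absolute temp path; the variable names are taken from the question, and sys_get_temp_dir() is one way to get a directory that is writable no matter where cron starts the process:
<?php
// Sketch only: the same transfer, but with an absolute local path so it
// behaves identically in the browser, the shell, and cron.
// $ftp_from, $ftp_to, $fname and $tmp are assumed to be set as in the script.
$local = sys_get_temp_dir() . '/ftp' . $tmp . '.tmp';
if (ftp_get($ftp_from, $local, $fname, FTP_BINARY) &&
    ftp_put($ftp_to, $fname, $local, FTP_BINARY)) {
    unlink($local);   // remove the temp file we actually wrote to
}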
The error says:
PHP Warning: ftp_get(ftp.tmp): failed to open stream
and the line is:
if((ftp_get( $ftp_from, "ftp" . $tmp . ".tmp", $fname ...
so the culprit must be an empty $tmp variable.
I am new to PHP and have little experience with manipulating files in it; I keep getting a 'permission denied' warning and can't seem to figure out a solution.
I've made a simple web application with one PHP file that has just these two lines:
$file = fopen( '/Users/<myname>/Desktop/file.txt', 'r' );
var_dump( json_decode( $file ) );
to see whether the variable has been set.
When I run the script in Chrome, I get:
Warning: fopen(/Users/<myname>/Desktop/file.txt): failed to open stream: Permission denied in /Applications/XAMPP/xamppfiles/htdocs/testwebsite/filereader.php on line 1
NULL
Presumably, the NULL is because $file has not been set or is empty.
I have read numerous posts online and I am still very confused about the explanations given. From my understanding, it is a permissions problem where I have not given the right privileges to some file.
I have tried many commands, such as chmod 755 /Applications/XAMPP/xamppfiles/htdocs/testwebsite/filereader.php, to give the script permission to at least read a file, but to no avail.
I have also read that it might be due to user privileges that were not properly set, so I ran these two commands to see who the user was:
echo get_current_user();
returns 'daemon'? A program that runs in the background?
exec('whoami');
returns '<myname>', my user name on this machine
I am not sure how to proceed from here, whether to give permission to the user that runs the script or to 'daemon'.
I just want the application to run on at least my personal computer.
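For what it's worth, here is a minimal diagnostic sketch (the path is copied from the question) that shows which user PHP runs as and whether that user can actually reach the file; note that every directory in the path also needs execute permission for that user:
<?php
// Sketch: run this through the browser to see the web server's view of the file.
$path = '/Users/<myname>/Desktop/file.txt';
echo 'PHP runs as: ' . exec('whoami') . "\n";                       // e.g. "daemon" under XAMPP
echo 'file exists: ' . var_export(file_exists($path), true) . "\n";
echo 'is readable: ' . var_export(is_readable($path), true) . "\n";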
I have a PHP server running some code in the background using a Gearman server.
The command-line PHP script runs in the background. In the middle, there is one server (a walker) which is always running and listening for events. When it sees a specific event, it fires another command-line command using PHP's exec("some command").
If the server has been freshly started this works OK, but after some hours it starts giving:
sh: 0: getcwd() failed: No such file or directory
at the point where we use exec().
Any idea how I can prevent this?
This error indicates that getcwd() returned NULL with errno set to ENOENT.
getcwd() will return ENOENT if the current working directory was unlinked. This seems to be the case here (according to the manpage, that's the only condition that causes getcwd() to return ENOENT).
Double-check that no one deleted the directory while the server was running, and that the server code itself doesn't call unlink() on the current working directory. Someone, somewhere, is deleting it.
As a good practice, it is often recommended that daemonized processes chdir() to a known directory where the daemon will perform its duties. This is precisely to avoid this sort of problem (and also because a daemon running in a separate filesystem could prevent the administrator from unmounting that filesystem).
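A minimal sketch of that practice in PHP (the choice of / as the pinned directory is illustrative):
<?php
// Sketch: pin the long-running worker to a directory that will not be
// deleted, so later exec() calls never inherit a vanished cwd.
chdir('/') or exit('cannot chdir to /');
// Defensive check before shelling out, in case the cwd was unlinked anyway:
if (getcwd() === false) {
    chdir('/');
}
exec('some command');   // placeholder for the real command from the question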
I am trying to run a PHP script via the Execute Shell build step in Jenkins, but it seems I am missing something.
Here is my command in the Execute Shell step of Jenkins:
#!/usr/bin/php
php /home/admin/reports/test.php"
I am not getting any error in console output.
And when I try these commands:
#!/bin/bash
php /home/admin/reports/test.php"
then I get error which says failed to open stream: Permission denied in /home/admin/reports/test.php" .
Jenkins, with a default installation, runs under the jenkins user. That user has no access to your /home/admin home directory. The Permission denied error is pretty self-explanatory.
Either give the jenkins user read access to /home/admin (not recommended),
or place the file in the /home/jenkins directory (not the best solution either),
or better yet, have the file available in a shared location accessible by both, preferably the job's workspace populated through the SCM (recommended).
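If in doubt, here is a quick sketch you can run as a build step to confirm what the build user can actually see (the path is taken from the question):
<?php
// Sketch: prints the user Jenkins runs the build as, and whether that
// user can read the script that fails.
echo 'build user: ' . exec('whoami') . "\n";   // typically "jenkins"
echo 'readable: ';
var_dump(is_readable('/home/admin/reports/test.php'));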
I have a PHP script that connects to a database and does inserts/updates. Obviously, if I run the script using the browser, it executes successfully, but once it moves into production this won't happen.
I would like to ask how I can invoke php.exe from a batch script that then runs the PHP script I have created to do the inserts/updates.
I have the following setup with XAMPP:
webdirectory/cronjobs/cronjob.php -> does the inserts and updates
webdirectory/cronjobs/runscript.bat -> will run cronjob.php
runscript.bat
@echo off
php.exe -f ../webdirectory/cronjobs/cronjob.php
cronjob.php
include_once('../db/dbcon.php');
$test = $db->run();
.... insert/update codes
Additionally, when I try to run the batch script directly from the Windows 7 command line, using drive:/xampp/php/php.exe -f drive:/xampp/htdocs/webdirectory/cronjobs/cronjob.php,
I get the following error:
Warning: include_once(): Failed opening '../webdirectory/db/dbcon.php' for inclusion (include_path='.; E:\xampp\php\PEAR') in E:\xammp\htdocs\webdirectory\db\dbcon.php on line xx
I guess the relative file path is failing. A follow-up question, though: does that mean I have to edit my script file and set the path to absolute? Instead of using ../webdirectory/db/dbcon.php, should I use e:/xampp/htdocs/webdirectory/db/dbcon.php? Looks like a bad thing to me...
Can somebody please help?
Change the directory to the location of the script to make all relative paths work when it is executed from outside the script's directory:
cronjob.php
chdir(__DIR__);
include_once('../db/dbcon.php');
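Alternatively, if you would rather not change the process-wide working directory, you can anchor the include to the script's own location:
<?php
// Same effect without chdir(): build an absolute path from this file's
// directory, so it works no matter where php.exe was started from.
include_once __DIR__ . '/../db/dbcon.php';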
Cron is not working properly. I have created a file inside /etc/cron.d with the following commands:
$ touch /etc/cron.d/php-crons
$ chown www-data /etc/cron.d/php-crons
I got an error like (*system*php-crons) WRONG FILE OWNER (/etc/cron.d/php-crons),
so I changed the file owner to root:
$ chown root /etc/cron.d/php-crons
Even so, cron is still not working.
My PHP file (cron.php) is as follows:
<?php
$fp = fopen('/etc/cron.d/php-crons', 'a');
fwrite($fp, '10 * * * * root usr/bin/php PATH TO SCRIPT/email.php'.PHP_EOL);
fclose($fp);
When I open /etc/cron.d/php-crons I can see the job:
10 * * * * root usr/bin/php /var/www/PATH TO SCRIPT/email.php
In email.php I included:
#!/usr/bin/php
<?php
mail("examplemail@gmail.com", "Cron Successful Public HTML!", "Hello World from cron.php!");
If I change the owner of /etc/cron.d/php-crons to root and then run cron.php in a browser, I am not able to write anything inside /etc/cron.d/php-crons and get warnings as follows:
Warning: fopen(/etc/cron.d/php-crons): failed to open stream: Permission denied in /var/www/cron.php on line 2
Warning: fwrite() expects parameter 1 to be resource, boolean given in /var/www/cron.php on line 3
Warning: fclose() expects parameter 1 to be resource, boolean given in /var/www/cron.php on line 4
Please, someone guide me!
There are several issues. As you noticed, your implementation of the cron daemon does not allow the files to be owned by non-root users, and you will not be allowed to make the file world-writable, for the same security reasons.
Your cron job line, at least in the example, lists a relative path ("usr/bin/php") that should be absolute. That your cron.php file is owned by root does not mean it runs as the root user. It is executed by PHP because it has the appropriate group and/or other permission bits, and you shouldn't change the permissions in a way that makes it run as root (e.g. with a setuid chmod).
What you should probably do is create a cron job line that acts as a wrapper for a PHP script. The PHP script reads all your jobs from a database, or even from a text file similar to the one you already created, and executes them.
This way you can run the wrapper script as a non-root user. Most cron implementations allow you to name the user (e.g. www-data) in the job line; on others you use the su command before your script (see man 5 crontab etc.).
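A rough sketch of that wrapper idea; the file names run_jobs.php and jobs.txt are made up for illustration:
<?php
// Installed once by root in /etc/cron.d, running as www-data, e.g.:
//   */10 * * * * www-data /usr/bin/php /var/www/run_jobs.php
// The web application only ever touches jobs.txt, never /etc/cron.d.
$jobs = file('/var/www/jobs.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($jobs ?: [] as $cmd) {
    $output = [];
    exec($cmd, $output, $status);
    if ($status !== 0) {
        error_log("job failed ($status): $cmd");   // log somewhere www-data can write
    }
}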
Which operating system are you using? Normally for something like this you want to use per-user crontabs, so the commands are run as that user. /etc/cron.d is used for system crontabs; user crontabs are stored in /var/spool/cron/crontabs, but they will need to be enabled, normally by adding the user that is allowed to use them to /etc/cron.allow. Allowing the web process to make changes to files that run as root is a really bad idea. With the per-user option it would only be able to touch things it could normally touch anyway, so it is less of a security risk.
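As a sketch of the per-user approach, assuming www-data has been added to /etc/cron.allow (the temp-file dance is needed because the crontab command installs a whole table at once):
<?php
// Sketch: append one job to the current user's own crontab instead of
// writing to /etc/cron.d. Per-user crontab lines have no user field.
// "PATH TO SCRIPT" is the placeholder kept from the question.
$job   = '10 * * * * /usr/bin/php /var/www/PATH TO SCRIPT/email.php';
$table = shell_exec('crontab -l 2>/dev/null') ?: '';   // existing table, if any
$tmp   = tempnam(sys_get_temp_dir(), 'cron');
file_put_contents($tmp, $table . $job . PHP_EOL);
exec('crontab ' . escapeshellarg($tmp));               // install the new table
unlink($tmp);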