How to run a command in .bashrc file without affecting SFTP connection? - php

I don't know why, but every time I try to make .bashrc run a command, it affects my SFTP connections from other software...
This time I'm just trying to start my Apache server automatically, because the servers restart every day at midnight, and it's kind of annoying to go to the command line and restart Apache myself. A big fail, but I guess I can't do much about my school's policies or whatever they do with their servers.
All I added at the end of my .bashrc was this:
~/apache/bin/apachectl start
but this simple command immediately conflicts with my SFTP connections from other software, so I'm not sure what the proper way to do this is.

You really really do not want to put this in your .bashrc. The bashrc gets executed for EVERY shell you start. You could potentially be running this command hundreds of times per login session.
Read the bash man page and figure out which of the various startup files you want this in; my guess would be the .bash_profile script. Even then it seems very odd to me...
http://www.linuxfromscratch.org/blfs/view/6.3/postlfs/profile.html
However, to avoid the SFTP problem you need to make sure that your .bashrc script does not write anything to STDOUT. If you run a command inside it, redirect the output to /dev/null or a logfile. In general it's a very bad idea to run any commands in .bashrc; you should mostly be setting configuration variables.
Running commands in .bash_profile is sort of OK, but in the long run it almost always causes more problems than it solves.
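Putting the two points above together, a minimal sketch of what such a startup hook could look like (the apachectl path is taken from the question; the pgrep guard and function name are assumptions):

```shell
# In ~/.bash_profile (not .bashrc): start Apache at most once, silently.
APACHECTL="$HOME/apache/bin/apachectl"   # path from the question

start_apache_quietly() {
    # Only try to start if the binary exists and httpd is not already up.
    if [ -x "$APACHECTL" ] && ! pgrep -x httpd > /dev/null 2>&1; then
        "$APACHECTL" start > /dev/null 2>&1   # silence stdout AND stderr
    fi
}

start_apache_quietly
```

The redirection is the part that matters for SFTP: the protocol breaks as soon as anything unexpected is written to the session's output stream.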

Related

ssh2 execution lifetime on the remote side

I am trying to run a bash script using php-ssh2. The script must run forever on the remote machine (the only way to stop it should be with pkill). The problem is that somehow the connection is closed and the bash script is killed.
nohup, disown and screen... I tried everything and nothing really changed; it simply doesn't keep the script alive.
What can I do?
(I know that this is a HUGE security hole, but this is just experimental; the main idea is to press an HTML button and run a bash script on the server machine, using apache2.)
Create a frequent cron job (every minute?) that first checks some kind of flag (e.g. the existence of a certain file) before running the job.
In the PHP code, only raise the flag (create the file).
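The flag-file idea can be sketched like this (the file names, script path, and function name are illustrative, not from the question):

```shell
# check_flag.sh - run from cron every minute:  * * * * * /path/to/check_flag.sh
FLAG=/tmp/run_job.flag                  # raised by the PHP side (touch/fopen)
JOB="${JOB:-/path/to/long_job.sh}"      # hypothetical long-running script

run_if_flagged() {
    if [ -f "$FLAG" ]; then
        rm -f "$FLAG"                               # lower the flag first
        nohup "$JOB" > /tmp/long_job.log 2>&1 &     # detach from cron's shell
    fi
}

run_if_flagged
```

Because cron, not the php-ssh2 connection, launches the job, the script's lifetime is no longer tied to the SSH session at all.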
Does the script generate output? If not, check out the keep alive option of ssh.

Running a looping PHP Script on my server

I have a Windows PC with Apache running, and I need a PHP script to run continuously, listening for input coming in on a UDP port, taking the required action, and sending a response back.
The only way I know how to do this is to install curl for cmd and run the PHP script with a while loop. What I am afraid of is that this is the wrong way to do it, and that it may be unreliable and take up a large amount of system resources.
Can people comment on the above method? I have heard of cron... but that's for Unix only? What can I do?
Try the solution below.
Use a bat file and schedule that bat file to execute.
For example, in a bat file executephp.bat, write this:
c:\xampp\php\php.exe -f c:\xampp\htdocs\do_something.php
Save the bat file containing that line.
Go to the Windows Task Scheduler, create a new task, and in the Actions tab browse to that executephp.bat; for "Start in", point to the directory containing executephp.bat.
For example, if you saved the file under C:\xampp\htdocs, put C:\xampp\htdocs in "Start in".
Remember to set the task to run even when the user is not logged on.
Everything is set, and it will execute without problems.
A PHP script behind Apache always has a maximum execution time, so the while loop will be stopped after that timeout.
You are better off using cron or a batch script, as Venkat recommended. There are also some good web cron services that will make a GET request to your server and run the script. Have a look at this related thread: Scheduled Request to my website from an external source
Doesn't that fit your needs?

Running a PHP script containing MySQL Queries in background with out using cronjobs

I have a PHP script, save.php, which contains SQL queries, and I need to run it in the background, i.e. not in the browser. To call this script I am using exec() in a file named trigger.php.
Now, when save.php is called via exec(), it runs fine as long as there are no MySQL queries in it; but when I add the MySQL queries, only the queries fail while the rest of the script executes fine. As far as I can tell, mysql_connect is not able to run.
I cannot use a cronjob because my need is different: I need to trigger the script once and have it run continuously, not at intervals.
So, is there a way to create a MySQL server connection in a script that needs to run in the background?
I have managed to figure out the problem.
I'm running MAMP for PHP and MySQL. MySQL's socket is created in MAMP's MySQL directory, but when we call a PHP script using exec(), the script is triggered from the shell/command prompt, which looks for the MySQL socket in the usual location, /var/mysql/mysql.sock. That is why we get the following error:
Warning: mysql_connect(): [2002] No such file or directory (trying to connect via unix:///var/mysql/mysql.sock)
So I created a directory under /var and added a symbolic link to the actual socket:
sudo mkdir /var/mysql
sudo ln -s /Applications/MAMP/tmp/mysql/mysql.sock /var/mysql/mysql.sock
Works very well now.
Thanks everyone for the suggestions!
I'm not exactly sure what you're asking (I lost your train of thought)... but if you are trying to get your script to run forever, or at least as long as your host allows, without using a cronjob, then you can use the set_time_limit($seconds) function at the top of your script to set the maximum execution time allowed by your host. At the end of the script, have it sleep($seconds) and then call exec() to execute itself again. Be sure to factor in the execution time, or you will miss the trigger.
If your sql connections aren't working, then you're going to need to figure out what is going on there. If you post some of your code, we can help you debug. mysql_connect and mysql_pconnect for persistent connections are the commands you will be using to establish connectivity with your database.
PHP scripts are not meant to be run as daemons (so many problems, of which the time limit is just the beginning), so you'll need a launcher.
pcntl_fork() is a simpler alternative to exec() if you're on a system that supports it.
As for your problem with exec(), the problem most likely has to do with the fact that the PHP interpreter you're executing has a different environment than the one that's running on your web server.
Try to execute phpinfo() on both environments to see what the actual differences are.
Though I must confess I can't see what would cause an "unable to connect to server" error; post the full error message, please. Could a local firewall be blocking php.exe? Some other security measure, perhaps? Or could you be getting incorrect information regarding the MySQL server's hostname?
Try to run your exec'ed script from the command line. That'll get you close to what the web server is doing.
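The environment mismatch is easy to demonstrate: a process launched with a stripped environment, as exec() from Apache effectively is, does not see your login shell's variables, which is exactly why the socket path or PATH can differ. A small sketch (the variable name is illustrative):

```shell
# Variables exported in your login shell are NOT seen by a stripped
# environment such as the one Apache hands to exec().
export DEMO_VAR=visible                                 # illustrative name
sh -c 'echo "inherited: ${DEMO_VAR:-unset}"'            # prints "inherited: visible"
env -i sh -c 'echo "stripped: ${DEMO_VAR:-unset}"'      # prints "stripped: unset"
```

Comparing `php -i` from the command line with phpinfo() in the browser works on the same principle: the same binary can read different ini values in different environments.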

php exec multiple commands, apache restart

I have to run 2 commands through exec();
the first command is a wrapper calling a (Plesk panel) subscription,
the second is also a Plesk command, for DNS.
Note: after I execute the add-subscription command, Apache WILL RESTART!
So my question is:
can I call exec() somehow so that both commands execute on the Linux side without losing the second command?
Ex:
exec('(/wrapper2 3 --create ... && /wrapper2 4 --update-soa example.com ...) > /dev/null 2>&1');
Will PHP send both commands to Linux to execute, or will Apache restart after the first command so that the second one never runs?
Thanks
Um... I'm thinking bad deal. Generally it is a bad idea for a process to tell its parent to restart while the process needs to keep running. But even if it were a good idea: Apache is the parent process of PHP in that context (do ps -A; you'll not see PHP), and I can't imagine that it would let you restart it and keep running at the same time.
I'd approach it this way: if you can bridge a delay, have a cron job look for whether a specific file exists, and if it does, execute the two commands you need. As a worst-case scenario, make PHP output a file containing the two commands you want run, and then have cron run that file.
Well, from my understanding the issue lies in the fact that Apache is going to be the parent of the running script; when Apache gets shut down, so will the script.
Barring that you can live with a somewhat derp-y setup, you can set up a cron job that looks for when it needs to restart the server (signalled either by a file you created via touch or by something from PHP), which can handle everything outside the context of Apache's process.
A sort-of-dirty idea. :(
Put the commands in a shell script and execute that script. It's less complicated and just in case you can call it with other tools as well like on apache restart or via cron.
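A sketch of such a wrapper (the wrapper2 invocations are abbreviated exactly as in the question; the use of setsid is an assumption, to detach the pair into its own session so a restart triggered by the first command cannot kill the second):

```shell
#!/bin/sh
# plesk_setup.sh - run both commands in their own session, detached from
# the calling Apache/PHP process; '&&' ensures the second runs only if
# the first succeeded.
setsid sh -c '
    /wrapper2 3 --create ... &&
    /wrapper2 4 --update-soa example.com ...
' > /dev/null 2>&1 &
```

PHP then only has to exec() this one script, and its exec() call can return immediately even while Apache restarts underneath it.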
I think the reason Apache restarts is that your command executes for too long or costs too much in system resources, making the Apache sub-process exit.
Try using FastCGI mode instead of mod_php.
You can make a shell file that executes the two commands.

CRON jobs keep erroring

We have several cron jobs running on our Apache / WHM server. We are using PHP. The scripts all work completely fine when run from the browser.
The cron will throw back errors like being unable to include files (even when given an absolute path).
Results also vary, corrupting output files, etc. I am really baffled, as sometimes the crons work fine as well. It seems really intermittent, yet they work perfectly every time when executed from the browser.
Any help would be appreciated, cheers.
As everyone has pointed out, PHP CLI and the PHP Apache Module are separate software and they have separate configuration files.
Rather than setting everything up in the root crontab, make sure all your permissions are correct. Debug as the user the jobs will run as under cron. Assuming you are running Linux, you can use
sudo -i -u username
for this.
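One common cause of the "unable to include files" symptom is cron's working directory: cron starts jobs in the user's home directory, so relative include paths that resolve fine from the browser silently break. A quick demonstration (the paths under /tmp are illustrative):

```shell
# Relative paths resolve against the current directory, which differs
# between Apache and cron; absolute paths work from anywhere.
mkdir -p /tmp/crondemo
echo 'payload' > /tmp/crondemo/config.txt

( cd /tmp/crondemo && cat config.txt )       # works: cwd matches
( cd / && cat config.txt ) 2>/dev/null \
    || echo 'relative include failed'        # what a cron job may see
cat /tmp/crondemo/config.txt                 # absolute path: always works
```

Using absolute paths (or cd-ing to the script's directory at the top of the cron command) removes this whole class of intermittent failures.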
