I'm new to Perl and I'm using a Windows machine.
I'm using PHP as well as Perl. My complete web application is written in PHP, and I have written a Perl script that does a DB update every 20 minutes. My Perl script goes like this:
use strict;
use warnings;
sub Process1()
{
    use DBI;
    use Date::Format;
    my $DBH = DBI->connect("DBI:mysql:dbname:localhost", "root", "");
    my $sth = $DBH->prepare("UPDATE tablename SET column1='blah'"); # update query
    $sth->execute();
    while (my @row = $sth->fetchrow_array)
    {
        print $row[0] . "\n";
    }
}
while (1) {
    &Process1();
    sleep 900;
}
To run my Perl script, I just open the Perl command line and execute the command:
perl C:\xampp\htdocs\samplescript.pl
My questions are -
How do I run the same Perl script on my Linux server?
How do I execute the command?
If I close the Perl command line on Windows, the script stops. How do I stop the same script on the Linux server?
IMHO, the best way to do this is to write your Perl code as a simple "one-shot" job, and rely on the features provided by the system to run it repeatedly:
Scheduled Tasks on Windows
Crontab on Linux
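For example, on Linux a crontab entry like this (the script path is a placeholder) would run the script every 20 minutes:

*/20 * * * * /usr/bin/perl /path/to/samplescript.pl

On Windows, schtasks can create a comparable scheduled task (the task name here is just illustrative):

schtasks /create /tn "PerlDbUpdate" /tr "perl C:\xampp\htdocs\samplescript.pl" /sc minute /mo 20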
How do I run the same Perl script in my Linux server?
There are several ways to execute a Perl script on Linux.
You need to execute your script with an interpreter. In most cases you use:
/usr/bin/env perl /Path/To/Your/script.pl
Or
/usr/bin/perl /Path/To/Your/script.pl
Alternatively, there is a better way:
define the interpreter path in the first line of your script.
#!/usr/bin/env perl
use strict;
use warnings;
sub Process1()
{
    use DBI;
    use Date::Format;
    my $DBH = DBI->connect("DBI:mysql:dbname:localhost", "root", "");
    my $sth = $DBH->prepare("UPDATE tablename SET column1='blah'"); # update query
    $sth->execute();
    while (my @row = $sth->fetchrow_array)
    {
        print $row[0] . "\n";
    }
}
while (1) {
    &Process1();
    sleep 900;
}
Then you are able to execute your script with the following command:
/path/to/your/script.pl
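Note that the script has to be marked executable first:

chmod +x /path/to/your/script.pl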
How to execute the command?
You can execute the same command in the Linux terminal.
If I close the Perl command line on Windows, the script will be stopped.
How do I stop the same script on the Linux server?
You can use the Linux "kill" command if you run the script in the background. The script will also stop if you close the terminal session or press Ctrl+C.
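For example (the script path and PID are placeholders):

perl /path/to/your/script.pl &    # run it in the background
ps -ef | grep script.pl           # find its process id (PID)
kill <PID>                        # stop it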
You can also look at this module and build your own Perl command to stop the script:
Getopt::Long
My Tip
Daemonize your script and add some commands to control it, so it can run in the background.
Explanation: Daemon (computing)
This may help if you haven't found it elsewhere.
To run the script on Linux: perl /path/to/script.pl. To run the script continuously on Linux, either use crontab or launch it from a shell script and run it in the background.
To stop the script on Linux, find the process with ps -ef | grep script.pl and kill it with kill -9 <process_id>.
I'm trying to launch a Julia program from a Drupal website (running on XAMPP for Windows); it needs to run asynchronously in the background while the PHP script continues execution. This is the code I'm trying to use:
$juliaFile = escapeshellarg(DRUPAL_ROOT . '/sites/all/modules/tsap/Modeling/runme.jl');
$cmd = "start /B julia $juliaFile";
pclose(popen($cmd, 'r'));
This code works perfectly if I run it through a command-line PHP script, but it doesn't work when run through Apache. However, the next bit of code works both from the command line and from the web server (the only difference is that it runs a PHP script instead of a Julia program):
$phpFile = escapeshellarg(DRUPAL_ROOT . '/sites/all/modules/tsap/Modeling/runme.php');
$cmd = "start /B php $phpFile";
pclose(popen($cmd, 'r'));
I've also tried calling the first block of code within a PHP file that gets executed by the web server; that also succeeds from the command line and fails when the server attempts to execute it.
I also get issues using backtick operators and exec() (they block on the call), and using COM::run() results in the same issue as pclose(popen()).
Does anyone have any ideas for getting the julia call to work?
Thanks for your time
Add '&' at the end of your command, which puts the command in the background.
You can also use proc_open() to run the command in the background. You can also refer to a post of mine that uses proc_open() to run commands in the background.
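For reference, a minimal proc_open() sketch (assuming the same $juliaFile variable as in your question; the descriptor layout and NUL redirection are just one possible setup):

$descriptors = array(
    0 => array('pipe', 'r'),        // stdin
    1 => array('file', 'NUL', 'a'), // discard stdout on Windows
    2 => array('file', 'NUL', 'a'), // discard stderr
);
$process = proc_open("start /B julia $juliaFile", $descriptors, $pipes);
if (is_resource($process)) {
    fclose($pipes[0]);
    proc_close($process); // should return quickly, since start /B detaches the Julia process
}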
I am trying to run the command prompt using a PHP script on XAMPP, but I am unable to do it.
I could run Explorer using:
exec("explorer");
But when I try to run
exec("C:\Windows\system32\cmd.exe");
It doesn't execute
How do I do it?
I want to run a command like
ping Google.com
This reads commands from the standard input and executes them:
while (true) {
$command = readline("Command: ");
passthru($command);
}
Note that if you run a command like ping, you might not be able to stop it this way (usually you would hit Ctrl + C). However, you can specify the number of pings to send (the flag is -n on Windows, -c on Linux):
ping -n 3 google.com
You can use the "exec()" function to run the code in background. The command prompt won't be displayed but will directly run the code.
Now, if you had to run a Python script from PHP, you could use:
exec('py "location of python script"');
exec('py C:\xampp\htdocs\pro\helloworld.py');
Any output files the script writes with relative paths end up in the directory where your PHP script is located, since that is the working directory.
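If you want the script's output inside PHP instead, exec() can capture it through its second parameter (using the same path as the example above):

exec('py C:\xampp\htdocs\pro\helloworld.py', $output);
print_r($output); // each line of the script's stdout becomes an array element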
I have a file named /root/folder/myfile.php that will handle incoming packets from a specific port by a GPS device.
When I use [root@main ~]# php /root/folder/myfile.php, everything works fine.
I need this file to run every second to listen.
I researched for a while and figured out that using the PHP CLI is a solution, so I tried the above command, but the file only keeps executing as long as the shell is open (I'm using PuTTY); when I close the shell, the process is killed.
How (and where) can I add a command that will run this file every second, or maybe in real time?
I'm using Linux CentOS 6.5.
Thanks in advance
nohup php myscript.php &
the & puts your process in the background.
The solution from Run php script as daemon process
To kill it:
1) Display all running processes with ps aux | less or the top command.
2) Find the PID (process ID) and kill it with: kill pid
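For example (the PID shown is only a placeholder):

ps aux | grep myfile.php    # note the PID in the second column
kill 12345                  # replace 12345 with the actual PID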
You would want to use the cron functionality of your server.
Similar to this maybe:
running a script from cron every second
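Since cron's smallest interval is one minute, the usual workaround (a sketch; the wrapper script name is hypothetical) is to have cron start a small wrapper once a minute that loops with sleep:

* * * * * /root/folder/every-second.sh

/root/folder/every-second.sh:

#!/bin/bash
for i in $(seq 1 60); do
    php /root/folder/myfile.php
    sleep 1
done

Note that if myfile.php itself keeps listening and never exits, the nohup approach above is the better fit.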
I have the following problem:
I have a XAMPP server running and I want it to execute a PowerShell script. A PHP script triggers a .bat file which contains the following code:
@echo off
cd C:\OpenBR\bin
start /WAIT br -algorithm FaceRecognition -compare C:\xampp\htdocs\upload C:\xampp\htdocs\DP C:\xampp\htdocs\results\result.csv
start /WAIT C:\xampp\htdocs\CSVconvert\sortieren.ps1
start /WAIT C:\xampp\htdocs\CSVconvert\Removedouble.ps1
start /WAIT C:\xampp\htdocs\CSVconvert\remove_path.ps1
start /WAIT C:\xampp\htdocs\CSVconvert\remove_foo.ps1
start C:\xampp\htdocs\CSVconvert\remove_quoatation.ps1
The first part works fine, up until the point where I want to execute the PowerShell script "sortieren.ps1". When I run the batch file manually, it executes and does the job; when triggered via PHP, it doesn't.
I set "Set-ExecutionPolicy Unrestricted" in both x86 and x64 shells.
I am just confused because the normal command line works and PowerShell doesn't, even after setting the policy to Unrestricted.
I viewed
executing a Powershell script from php
and
PowerShell on Windows 7: Set-ExecutionPolicy for regular users
but couldn't solve the problem.
What did I miss?
The session you are running those commands in doesn't have the same environment variables as when you are using PowerShell to run them manually. You'll have to specify the absolute path to the PowerShell executable and to the scripts that you want to run, so that they will be found.
start /WAIT C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe C:\xampp\htdocs\CSVconvert\sortieren.ps1
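If the execution policy is still the problem when running under Apache, you may also be able to pass it explicitly on the command line:

start /WAIT C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -ExecutionPolicy Bypass -File C:\xampp\htdocs\CSVconvert\sortieren.ps1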
Since the problem was the environment, I thought you might benefit from a package that handles that aspect automatically. Here is a project that allows PHP to obtain and interact dynamically with a real PowerShell. Get it here: https://github.com/merlinthemagic/MTS
After downloading you would simply use the following code:
$shellObj = \MTS\Factories::getDevices()->getLocalHost()->getShell('powershell');
$strCmd1 = 'first command from first script';
$return1 = $shellObj->exeCmd($strCmd1);
$strCmd2 = 'second command from first script';
$return2 = $shellObj->exeCmd($strCmd2);
Instead of triggering a single script, you can just trigger each command individually and handle the return. You can issue any command you like against the $shellObj; the environment is maintained throughout the life of the PHP script.
I am using phpseclib to SSH to my server and run a Python script. The Python script is an infinite loop, so it runs until you stop it. When I execute python script.py via SSH with phpseclib, it works, but the page just loads forever. It does this because phpseclib does not think it is "done" running the line of code that runs the infinite-loop script, so it hangs on that line. I have tried using exit and die after that line, but of course that didn't work, because it hangs on the line before, the one that executes the command. Does anyone have any ideas on how I can fix this without modifying the Python file? Thanks.
Assuming the command will be run by a shell, you could have it execute this to start it:
nohup python myscript.py > /dev/null 2>&1 &
If you put an & on the end of any shell command it will run in the background and return immediately, that's all you really need.
Something else you could also have done:
$ssh->setTimeout(1);