I'm running PHP CLI commands inside an Ubuntu VM via SSH. The problem is that certain commands have started causing segmentation faults.
Oddly, it only happens when the command is run via SSH. If I run it directly inside the VM, there's no problem. Also, it doesn't happen on another machine running the same VM.
When I run the PHP command (via SSH from the host machine), the only thing I see in the logs on the VM is this:
Mar 22 14:53:00 local kernel: [ 1868.863475] php[4967]: segfault at 7ff60c200000 ip 00007ff6202b284e sp 00007ffce7f19448 error 4 in libc-2.23.so[7ff620213000+1c0000]
This seems to indicate that the problem is actually not with PHP, but with the standard library. If that's the case, how can I go about debugging this, and/or how should I submit a bug report?
Details, if they're relevant:
My host machine is also Ubuntu, and the VM is DrupalVM using the beet/box base box. The command I'm actually running is blt setup (see https://github.com/acquia/blt), which calls drush cex inside the VM via SSH.
It's this drush cex command that recently started causing the seg fault.
First option:
strace mycommand
...and watch what happens
Second option:
gdb mycommand
...and step through it
Third option:
Gather the core dumps and analyze them.
- https://stackoverflow.com/questions/17965/how-to-generate-a-core-dump-in-linux-on-a-segmentation-fault
The first one is usually sufficient if you have some intuition about what's going on. In your case it's probably something silly like terminal capability negotiation or some other not-really-core-relevant thing, but you never know... it could be serious.
Second option is the most complicated.
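To make the first and third options concrete, here is a minimal sketch, assuming a bash shell inside the VM; the drush invocation and the core_pattern setting are assumptions, and on Ubuntu apport may intercept core files unless it is disabled:

ulimit -c unlimited                          # allow core files in this shell
sudo sysctl -w kernel.core_pattern=core.%p   # write core.<pid> into the working directory
strace -f -o /tmp/drush.strace drush cex     # option 1: trace syscalls, then read the tail of the log
gdb $(which php) core.<pid>                  # option 3: after a crash, load the dump and run 'bt' for a backtrace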
I have an XML database that I want to manage independently from users on my website. Looking into the matter, it appears that I should write a daemon script to manage my database. That is all fine and dandy, but I feel like I'm opening a can of worms. I wanted to write my daemon script in PHP, so I looked into PCNTL, but I quickly learned that PCNTL is not suited for web servers. So now I am stumped. How can I get a daemon to run on my server? Do I need to learn another language? I only want to write my own scripts, but I feel lost. I would prefer to write my daemon in PHP as I am familiar with the language.
I have been researching everything from PCNTL and the CLI to SO questions and numerous articles on daemon processes, etc.
I am running PHP 5.6.32 (CLI) on Windows 7 with Apache (XAMPP 5.6.32); the target server is a Unix system.
EDIT: I also have Windows set up to run PHP from the command prompt.
There's nothing wrong with running a PHP daemon; however, it's not the fastest thing, especially before PHP 7.0. You can proceed in two ways:
Using cron jobs: on Unix systems, crontab will be fine. This way you specify the interval at which the system automatically executes the specified script, and the script exits when it's done.
A true daemon: first change max_execution_time in php.ini to 0 (infinite), then make set_time_limit(0); the first call in your daemon; it only needs to run once. However, if there is a failure such as an uncaught error, the script will exit and you will need to start it again manually. Don't wrap a try...catch in a while loop either, because it will probably turn into an endless loop. Execute the script with php -f daemon.php, as sketched below.
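A minimal sketch of the daemon approach, assuming the file is called daemon.php; the doWork() helper and the sleep interval are illustrative placeholders, not part of any real project:

<?php
// daemon.php - minimal long-running loop (illustrative sketch)
set_time_limit(0);          // remove the execution time limit for this run

while (true) {
    doWork();               // placeholder: replace with the real task (e.g. updating the XML database)
    sleep(2);               // pause between iterations to avoid busy-waiting
}

function doWork()
{
    // actual work goes here
}

Start it with php -f daemon.php and keep it alive with whatever process supervision you have available.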
I get a strange PHP bug on a PHP 5.6 / Symfony 2.7 project, running on a CentOS6 server through Apache.
I have a Symfony console command running as a service which launches some other console commands every 2 seconds. I use the Symfony Process component to launch the sub-processes and have timeout management.
Everything is done to avoid launching parallel processes from the main command; the launch pattern is roughly the one sketched below.
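For reference, a rough illustration of that pattern with the Process component; the sub-command name and timeout value are illustrative, not taken from the actual project (Symfony 2.7's Process takes the command as a string):

use Symfony\Component\Process\Process;

$process = new Process('php app/console app:some-subcommand');
$process->setTimeout(60);   // kill the sub-process if it runs longer than 60 seconds
$process->run();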
The issue I have is that sometimes the PHP console commands don't stop after finishing their work. If I launch the commands by hand, everything runs correctly on the PHP side, but the console prompt never comes back after the PHP statements have finished, unless I press Ctrl+C.
The issue happened a lot when the PHP version was 5.5, but now with PHP 5.6 it (only) happens randomly. When it happens, I can see a lot of stuck php sub-processes, probably launched by the main command.
I just can't find any explanation, since the php commands don't raise any error. It's just that the console gets stuck and waits for something to finish.
Does anybody have a possible solution to this issue?
I'm looking for some advice.
Right now I've got a bunch of PHP scripts that I've scheduled through cron. They run on my local machine doing things like pulling data out of a MySQL db and sending automated emails. To run them I just have something like this in crontab: 0 7 * * 1 /usr/bin/php /phpscripts/script.php
I need to migrate all of those scripts to a Windows machine. I'm planning to use the Windows Task Scheduler to run the scripts, but how can I run the actual PHP scripts locally? From what I understand, you need something like XAMPP to run an Apache server? I guess what I need is a Windows equivalent of /usr/bin/php in crontab.
Installing PHP
You don't have to install XAMPP; you can install PHP alone. Have a look at the Windows PHP installation guide:
Windows Installer (PHP 5.1.0 and earlier)
Windows Installer (PHP 5.2 and later)
Manual Installation Steps
If you prefer installing XAMPP, you can run a PHP script after locating php.exe, using the -f flag:
C:\Xampp\php\php.exe -f C:\Xampp\htdocs\my_script.php
Running the PHP file
After you have PHP installed, check the Command Line PHP on Microsoft Windows manual page for information on how to run the script. That page explains how to make the PHP file executable, so you can run it as:
"C:\PHP Scripts\script" -arg1 -arg2 -arg3
Make sure you are using an administrative account to run the command; otherwise you might have permission problems. More info is in the Introduction to using PHP on the command line.
Scheduling the task
Go to Start -> Programs -> Accessories -> System Tools -> Scheduled Tasks,
Right-click on an empty spot in the Scheduled Task window and select New -> Scheduled Task (Also accessible via File -> New -> Scheduled Task)
Name the new task (anything descriptive will do)
Double-click the new task to open the properties window (or File -> Properties)
Under the Task tab, enter the same command that you used to test the script above. For instance, I would enter:
C:\PHP\php.exe "C:\Inetpub\wwwroot\blogs\cron\cron_exec.php"
Go to the Schedule tab and enter when and how often the task should run. The schedule defaults to run once daily and should be fine for basic usage, but feel free to tweak as needed.
The rest of the fields can be left as-is, unless you're an ace and know what you're doing.
Click OK and we're done!
For more info, have a look at setting up a Windows scheduled task.
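If you prefer the command line over the GUI, schtasks can create an equivalent task; the paths and task name below are illustrative, and the schedule mirrors the crontab entry 0 7 * * 1 from the question:

schtasks /Create /SC WEEKLY /D MON /ST 07:00 /TN "php-weekly-script" /TR "C:\PHP\php.exe C:\phpscripts\script.php"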
Set up your task to run when you want it (times and all that)
and pop this into the command:
C:\Path\to\php.exe -f "C:\Path\to\file.php"
Edit: you can also set up a second php.ini that is used when the CLI runs a file, which has no constraints on max execution time and the like. It's a very handy difference and better suited to running (potentially) long-running scripts.
You can do this by creating a php-cli.ini file in the PHP folder where your php.ini file resides. It will be used automatically when a PHP file is executed from the CLI (which is how scheduled tasks are run).
Also note that Windows Task Scheduler will simply end the task if an error causes your script to fall over, so adding some extra logging might be a good idea in case your scripts exit early.
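As a sketch, a php-cli.ini next to php.ini might contain only the overrides that matter for long-running scheduled scripts; the values and log path below are illustrative:

max_execution_time = 0
memory_limit = 512M
log_errors = On
error_log = "C:\phpscripts\cli-errors.log"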
I got the 140dev Twitter framework (which uses the Twitter Phirehose) running manually (via the web browser on my local WAMP server), but I can't figure out how to run both get_tweets.php and parse_tweets.php as a background process, like with these SSH commands:
nohup php script.php > /dev/null &
Some of you suggested using (the Windows equivalent of) cron jobs, but this isn't the right way to go. I think that's because creating multiple connections (or re-connections) to the Twitter streaming Phirehose isn't allowed?
How can I run both PHP scripts (get_tweets.php and parse_tweets.php) as a background process on my local WAMP server (and later on a VPS)?
Just to clarify:
I am using a WAMP server (first to test a little bit and later to run it on a VPS)
Using LAMP or any *nix server/system isn't an option (due to time, experience and lack of skills)
I have searched for solutions (on Google and Stack Overflow), but they are either not working or not clear enough for me (I am new to this).
Thank you in advance.
Find the php/bin folder where php.exe is located. Copy the folder path and add it to your PATH environment variable (follow a guide on editing the PATH variable, for instance).
Once this is done, you'll be able to execute php on the command line from anywhere. Just run php script.php from a command prompt in the right folder and it should work. There may be some configuration to do so that the command-line php uses WAMP's php.ini.
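To mimic the nohup command from the question on Windows, one common approach (an untested sketch; the script names come from the question, the log paths are illustrative) is to launch each script detached from the current console with start /B:

start /B "" cmd /c "php get_tweets.php > tweets.log 2>&1"
start /B "" cmd /c "php parse_tweets.php > parser.log 2>&1"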
OS: Ubuntu 11.10
Webserver: Apache
Code: PHP
Hello, I am trying to exec a C program from a PHP web page. When I run the same C program directly in a terminal, it works fine, but when I exec it through PHP, I get a segmentation fault.
Any idea why it behaves this way? My C code does a few small mallocs in a couple of places. The code nevertheless works fine when executed directly from the terminal using ./a.out.
Is there a way for me to run gdb on the C code when PHP tries to execute it?
Thanks
Most likely it is a user permissions error. Your web server will run as a different user (nobody, wwwrun or similar). Try doing an su to the web server user, and running the C program as that user.
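For example, on Ubuntu the Apache user is typically www-data, so a quick check would be to run the binary as that user (the binary path is illustrative):

sudo -u www-data ./a.out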
Hard to tell without actually seeing the code. Are you sure your program doesn't leak? Are you able to add some debug console output to see when/where it crashes? Does your program try to access any resources (like files, ports, etc.)? Are there sufficient rights for the web server's user (or whoever runs the PHP script) to actually execute it properly?
What you could do is enable core dump file creation and load the core dump into gdb after the executable crashes. To enable core dump creation, see what ulimit does.
BTW: one possible reason for your program crashing can be uninitialized variables, in particular pointer variables.
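Building on the core dump suggestion, a rough sketch of an exec call that enables core files for the child process and captures its output as well (the binary path is illustrative):

// PHP side: run the binary with core dumps enabled and capture stderr too
exec('ulimit -c unlimited && ./a.out 2>&1', $output, $exitCode);
// an exit code of 139 usually means the child was killed by SIGSEGV (128 + 11)
var_dump($exitCode, $output);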