Cronjobs on Google Sitemap for Opencart - Php-cli command not found - php

Opencart generates its sitemap on the fly, which is a problem for big catalogs with over 10,000 products. So I have modified the function to generate a static sitemap in an XML file.
When I access http://localhost/index.php?route=extension/feed/google_sitemap, it generates a sitemap-google.xml file without problems and with unlimited execution time.
I tried to add it as a cron job on the development server, running every 4 hours:
0 0,4,8,12,16,20 /usr/bin/php /path/to/index.php?route=extension/feed/google_sitemap
But I'm receiving a "command not found" error.
Can I pass the "?route=..." URL parameters like this on the CLI?

You cannot do that, as the URL parameters are only evaluated this way when the script is called through a web server. But a quick solution could be to use wget: keep a copy of that sitemap script somewhere under some kind of "secret URL", call it using wget, and write the result to your disk.
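A minimal sketch of that wget approach as a crontab entry, assuming the same 4-hour schedule as the question; the URL and output paths are placeholders, and downloading to a temp file first keeps the live sitemap intact if the fetch fails:

```shell
# Fetch the sitemap through the web server every 4 hours; only replace the
# cached copy once the download has succeeded (paths are placeholders):
0 0,4,8,12,16,20 * * * wget -q -O /var/www/sitemap-google.xml.new "http://localhost/index.php?route=extension/feed/google_sitemap" && mv /var/www/sitemap-google.xml.new /var/www/sitemap-google.xml
```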
If you cannot use wget, you could use a PHP script that calls file_get_contents. In the same way, it would request the data over an HTTP request and save it to the cached sitemap file.
As a note: if you know which logic is needed to generate that sitemap, you could also write all of that logic directly into a standalone PHP script. Running it from the shell helps avoid blocking a server thread, but it might be more work.

https://stackoverflow.com/a/62145786/4843247
You can use the following command to generate sitemap.xml:
cd /PATH_TO_YOUR_WWW_ROOT && QUERY_STRING="route=extension/feed/google_sitemap" php -r 'parse_str($_SERVER["QUERY_STRING"],$_GET);include "index.php";' >sitemap.xml.new && mv sitemap.xml.new sitemap.xml

Related

I'm trying to embed batch into PHP, but it is not executing

So, I'm trying to set up a batch executable inside a website (PHP in this case), so it would download a certain file directly to the desired directory, without the user needing to interact with it. Basically, the plan is: if there were a website with mods/in-game builds/worlds for a game, you'd want to download them directly into AppData, and not bother with moving them from Downloads manually.
I am using XAMPP on localhost to test it (and I did run it as admin).
I searched online for how to embed batch inside PHP, and got to this:
<?php
exec("cd %AppData% && curl <LINK> -o <NAME>.<FILE_SUFFIX>");
?>
I tried 'system' instead of 'exec', and adding 'cmd /c' in front of the command as well, but neither worked.
After that I tried a different approach, just to test:
<?php
exec("start batch.bat");
?>
with this batch file:
@echo off
cd %AppData%
curl <LINK> -o <NAME>.<FILE_SUFFIX>
pause
Which resulted in
'curl' is not recognized as an internal or external command, operable program or batch file.
I also tried absolute path instead of relative, but no positive result either.
Now I don't know what else to try and what could be causing this. If there is another viable option to achieve what I've stated above, please do let me know as well.
So here's the working code I am using now
<?php
exec("bitsadmin /transfer myDownloadJob /download /priority high <LINK> <TARGET_LOCATION_FILE>");
?>
I still don't know why curl didn't work, but since bitsadmin is a native Windows command, it's better anyway. Thanks to everyone who helped!
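A general way to diagnose this kind of "not recognized" failure is to check whether the binary is on the PATH of the process that runs it, and to call it by its absolute path if not. A sketch of the idea in a Unix shell (on Windows, `where curl` plays the role of `command -v`, and recent Windows 10/11 builds ship curl.exe in C:\Windows\System32, so that absolute path is worth trying inside exec()):

```shell
# Resolve curl's absolute path; the /usr/bin/curl fallback is only an
# illustrative assumption for systems where the lookup fails:
CURL_BIN=$(command -v curl || echo /usr/bin/curl)
echo "$CURL_BIN"
# The exec() string in PHP would then use "$CURL_BIN" instead of bare "curl".
```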

getting Could not open input file when trying to call php script from shell_exec

I am trying to build a small custom task scheduler. The idea is that cron runs my process script, which looks in the database, finds any scheduled tasks that are ready to run, and runs them. I think the best way to do this is to launch the tasks "in the background" by way of shell_exec and > /dev/null, which I understand keeps the initial (process) script from waiting for the task scripts to complete.
So first, if there is a better way to achieve this, I'm open to suggestions. Though note I am on PHP 5.3, so there may be some options in 5.4 and up that I don't have access to :(
However here's the question at hand:
I am testing on WAMP on my windows machine and I am trying to make a call that looks like this:
shell_exec("php $path$base_url$querystring > output_test.txt 2>&1 &");
$path is the full windows path to the script
$base_url is the base url of the script I am calling
$querystring is of course the query string being passed to the task script
I am also redirecting output to output_test.txt, which creates that file in the same directory, where I get the following error:
Could not open input file:
C:\xampp\htdocs\email\batch_email_send_u2u.php?dealer=7
Yes, I realize the path references an XAMPP installation, but that is not the issue: all the WAMP files execute from there, and everything else has worked like this for years; it was just set up this way to support a legacy setup.
It seems to me shell_exec is locating and running PHP; it just can't open the referenced script. I can't figure out why.
Also I need to eventually get this working on a real linux server so any advice on how to make that happen would be greatly appreciated!
Found a solution! Special thanks to dan08 for getting me set on the right path.
Ultimately I found the answer in this thread: Pass variable to php script running from command line
I ended up using the $argv[] array as described in that post, and with a little tweak to the script I'm calling, it works like a champ now.
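The shape of that fix, sketched below; the script path and the dealer=7 parameter come from the question, and the parse_str() step is the technique described in the linked answer:

```shell
# Pass the parameter as a plain CLI argument instead of a query string:
# "batch_email_send_u2u.php?dealer=7" names a file that does not exist,
# which is exactly what "Could not open input file" complains about.
CMD='php C:\xampp\htdocs\email\batch_email_send_u2u.php dealer=7'
printf '%s\n' "$CMD"
# Inside the script, $argv[1] then holds the string "dealer=7", and
# parse_str($argv[1], $params) yields $params["dealer"] = "7".
```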

curl not executing properly when invoked by php

So I want to execute a bash command from PHP on my web server. I can do this using shell_exec. However, one of the commands I want to execute is curl. I use it to send a .wav file to another server and record its response. But when invoked from PHP, curl doesn't work.
I reduced the error to the following small example. I have a script named php_script.php which contains:
<?php
$ver = shell_exec("curl -F file=@uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver");
echo $ver;
The curious thing is that when I run this php script from command line using php php_script.php, the result I get is
Status: 500 Internal Server Error
Content-type: text/html
However, if I run curl -F file=@uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver directly from the shell, I get the response I was expecting:
verdict = authentic
(Edit:) I should probably mention that if I put some bash code inside the shell_exec argument which does not contain curl, the bash command executes fine. For example, changing the line to $ver = shell_exec("echo hello > world"); puts the word "hello" into the file "world" (provided it exists and is writable). (End edit.)
Something is blocking the execution of curl when it is invoked from PHP. I thought this might be PHP running in safe mode, but I found no indication of this in php.ini. (Is there a way to test this, to make 100% sure?) What's blocking curl and, more importantly, how can I bypass or disable this block?
(And yes, I realize PHP has a curl library. However, I prefer to use commands I can run from the command line as well, for debugging purposes.)
cheers,
Alan
The reason is privileges: when you run the command directly, you run it as your own user (often root), and so the command gets executed. But when you run the command through PHP, it runs as the web-server user, which by default does not have the privileges to run commands via shell_exec.
You would have to change the shell_exec settings through cPanel or the Apache config file. However, giving the web user shell_exec access is not recommended, as it helps attackers compromise the server, so proper care should be taken.
It would be more appropriate to use the curl library provided in PHP.
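Whatever the cause turns out to be, one debugging step helps in every shell_exec() case like this: shell_exec() returns only stdout, so a failing command can look like an empty result. Appending 2>&1 inside the command string makes the error text visible. The principle, shown with a deliberately failing command instead of curl:

```shell
# Capture stderr along with stdout; without 2>&1 the error message would be
# lost and the captured result would simply be empty. The || true keeps the
# deliberately failing demo command from aborting under `set -e`.
out=$(ls /nonexistent-path-for-demo 2>&1 || true)
echo "captured: $out"
```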

Identify which PHP script is running?

I have a large PHP application and I'm looking for a way to know which PHP script is running at a given moment. Something like when you run "top" on a Linux command line but for PHP.
Are you trying to do so from within the PHP application, or from outside of it? If you're inside the PHP code, calling debug_print_backtrace(); at that point will show you the 'tree' of PHP files that were included to get to that point.
If you're outside the PHP script, you can only see the one process that called the original PHP script (index.php or whatnot), unless the application spawns parallel threads as part of its execution.
If you're looking for this information at the system level, e.g. all PHP files running under any Apache child process, or even PHP files in use by other apps, there is the lsof program (list open files), which by default prints ALL open files on the system (executables, sockets, FIFOs, .so files, etc.). You can grep its output for '.php' to get a pretty complete picture of what's in use at that moment.
This old post shows a way you can wrap your calls to php scripts and get a PID for each process.
Does PHP have threading?
$cmd = 'nohup nice -n 10 /usr/bin/php -c /path/to/php.ini -f /path/to/php/file.php action=generate var1_id=23 var2_id=35 gen_id=535 > /path/to/log/file.log & echo $!';
$pid = shell_exec($cmd);
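Once the PID comes back, it can be polled to see whether the task is still running: kill -0 sends no signal and only checks that the process exists (and that you are allowed to signal it). A sketch, using the current shell's own PID as a stand-in for the one shell_exec() returned:

```shell
pid=$$   # stand-in for the PID captured from shell_exec() above
if kill -0 "$pid" 2>/dev/null; then
  status="running"
else
  status="finished"
fi
echo "$status"
```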

Call URL with wget and return an ERRORLEVEL depending on URL's contents

A client has a Windows-based in-house server on which they edit the contents of a CMS. The data is synchronized nightly with the live web server, as a workaround for a slow Internet connection.
There are two things to be synchronized: new files (already sorted out) and a MySQL database. To do this, I am writing a script that exports the database into a dump file using mysqldump and uploads the dump.
The upload is done using a third-party FTP automation tool named ScriptFTP.
I then need to run a PHP-based import script on the target server. Depending on this script's return value, the ScriptFTP operation continues and some directories are renamed.
I need an external tool for this, as ScriptFTP only supports FTP calls. I was thinking about the Windows version of wget.
Within ScriptFTP, I can execute any batch or exe file, but I can only evaluate the errorlevel resulting from the call, not the stdout output. This means I need to return errorlevel 1 if the PHP import operation goes wrong and errorlevel 0 if it goes well. Additionally, I obviously need to return a positive errorlevel if the connection to the import script could not be made at all.
I have total control over the importing PHP script, and can decide what it does on error: Output an error message, return a header, whatever.
How would you go about running wget (or any other tool to kick off the server side import) and returning a certain error level depending on what the PHP script returns?
My best bet right now is building a batch file that executes the wget command, stores the result in a file, and returns errorlevel 0 or 1 depending on that file's contents. But I don't really know how to match a file's contents using batch programming.
You can do the following in PowerShell (this assumes GNU wget for Windows is on the PATH; note that in Windows PowerShell, wget is otherwise an alias for Invoke-WebRequest, which takes different options):
$a = wget --quiet -O - www.google.com
$rc = $a.CompareTo("Your magic string")
exit $rc
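For reference, the same errorlevel contract in a POSIX shell: grep exits 0 when the marker string is found and 1 when it is not, which maps directly onto what ScriptFTP checks (in a plain batch file, findstr behaves the same way). The URL and marker string below are placeholders:

```shell
# Real use:  wget -q -O - "http://example.com/import.php" | grep -q "IMPORT_OK"
# Demonstrated here with canned input instead of a network call:
printf 'import finished: IMPORT_OK\n' | grep -q "IMPORT_OK"
rc=$?
echo "$rc"
```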
