Download a zip file using curl or wget - php

I need to download several zip files from this web page:
http://www.geoportale.regione.lombardia.it/download-pacchetti?p_p_id=dwnpackageportlet_WAR_geoportaledownloadportlet&p_p_lifecycle=0&metadataid=%7B16C07895-B75B-466A-B980-940ECA207F64%7D
using curl or wget, so not in an interactive way.
A sample URL is the following:
http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12
If I open this link in a new browser tab or window, everything works fine, but with curl or wget I can't download the zip file.
Watching what happens in the browser with Firebug (or the browser console in general), I can see there is first a POST request and then a GET request, and I'm not able to reproduce these requests with curl or wget.
Could it also be that some cookies are set in the browser session, and the links don't work without them?
Any suggestion will be appreciated.
Cesare
NOTE: when I try to use wget, this is my result:
NOTE 2: 404 Not Found
NOTE 3 (the solution): the right command is
wget "http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12"
Then I have to rename the file to something like "pippo.zip", and this is my result; or, better, I can use the -O option like this:
wget "http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12" -O pippo.zip

Looking at your command, you're missing the double quotes. Your command should be:
wget "http://www.geoportale.regione.lombardia.it/rlregis_download‌​/service/package?dbI‌​d=323&cod=12"
That should download it properly.
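The quotes matter because an unquoted & makes the shell cut the URL at dbId=323 and put the command in the background, which is most likely where the 404 came from. As a sketch of the alternative, escaping the ampersand instead of quoting should work too:
wget http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323\&cod=12 -O pippo.zip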

Related

Cronjobs on Google Sitemap for Opencart - Php-cli command not found

Opencart generates its sitemap on the fly, and this is a problem for big catalogs with over 10,000 products. So I have modified the function to generate a static sitemap in an XML file.
When I access http://localhost/index.php?route=extension/feed/google_sitemap I generate a sitemap-google.xml file without problems and with an unlimited execution time.
I tried to add it as a cron job on the development server, every 4 hours:
0 0,4,8,12,16,20 /usr/bin/php /path/to/index.php?route=extension/feed/google_sitemap
But I'm receiving a "command not found".
Can I execute the "?params/etc" part on the CLI?
You cannot do that, as the URL parameters are only evaluated this way when the script is called through a web server. But a quick solution could be to use wget: keep a copy of that sitemap script anywhere under some kind of "secret URL", call it using wget, and put the result on your disk.
If you cannot use wget, you could use a PHP script with file_get_contents. In the same way, it would request the data over an HTTP request and save it into the cached sitemap file.
As a note: if you know which logic is needed to generate that sitemap, you could also write all that logic directly into a PHP script. Running it from the shell helps avoid a blocked server thread, but it might be more work.
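A minimal sketch of the file_get_contents variant, assuming the sitemap route from the question is reachable from the server itself and the target path is a placeholder you would adjust:
<?php
// Sketch: fetch the generated sitemap over HTTP and store it as a static file.
$sitemapUrl = 'http://localhost/index.php?route=extension/feed/google_sitemap';
$targetFile = '/path/to/www/sitemap-google.xml';

$xml = file_get_contents($sitemapUrl);
if ($xml === false) {
    fwrite(STDERR, "Could not fetch sitemap\n");
    exit(1);
}

// Write to a temporary file first, then rename, so the live sitemap is never half-written.
file_put_contents($targetFile . '.new', $xml);
rename($targetFile . '.new', $targetFile);
?>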
https://stackoverflow.com/a/62145786/4843247
You can use the following command to generate sitemap.xml:
cd /PATH_TO_YOUR_WWW_ROOT && QUERY_STRING="route=extension/feed/google_sitemap" php -r 'parse_str($_SERVER["QUERY_STRING"],$_GET);include "index.php";' >sitemap.xml.new && mv sitemap.xml.new sitemap.xml
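Combined with the every-4-hours schedule from the question, the crontab entry might look like this (the path is still a placeholder, and cron needs all five schedule fields):
0 0,4,8,12,16,20 * * * cd /PATH_TO_YOUR_WWW_ROOT && QUERY_STRING="route=extension/feed/google_sitemap" php -r 'parse_str($_SERVER["QUERY_STRING"],$_GET);include "index.php";' >sitemap.xml.new && mv sitemap.xml.new sitemap.xml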

I'm trying to embed a batch script into PHP, but it is not executing

So, I'm trying to set up a batch executable inside a website (PHP in this case), so it would download a certain file directly to the desired directory, without the user needing to interact with it. Basically, the plan is: if there were a website with mods/in-game builds/worlds for a game, you'd want to download them directly into AppData and not bother with moving them from Downloads manually.
I am using XAMPP on localhost to test run it (and I did run it as admin).
I searched online for how to embed batch inside PHP, and got to this:
<?php
exec("cd %AppData% && curl <LINK> -o <NAME>.<FILE_SUFFIX>");
?>
I tried with 'system' instead of 'exec', and with 'cmd /c' in front of the command as well, but that is not working either.
I tried a different approach after that, just to test:
<?php
exec("start batch.bat");
?>
with this code
@echo off
cd %AppData%
curl <LINK> -o <NAME>.<FILE_SUFFIX>
pause
Which resulted in
'curl' is not recognized as an internal or external command, operable program or batch file.
I also tried an absolute path instead of a relative one, but with no positive result either.
Now I don't know what else to try or what could be causing this. If there is another viable option to achieve what I've described above, please let me know as well.
So here's the working code I am using now
<?php
exec("bitsadmin /transfer myDownloadJob /download /priority high <LINK> <TARGET_LOCATION_FILE>");
?>
I still don't know why curl didn't work, but since bitsadmin is a native Windows command, it's better anyway. Thanks to everyone who helped!
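For anyone hitting the same "'curl' is not recognized" error: the PHP/Apache process often runs with a different PATH and user profile than your interactive shell, so capturing the command's output and exit code from exec makes the failure visible. A sketch, assuming curl sits in its usual Windows 10+ location and using placeholder URL/file names:
<?php
// Sketch: the download URL and file name are placeholders; adjust to your case.
// Note: %AppData% expands to the profile of the account running PHP,
// which is not necessarily the logged-in user when PHP runs as a service.
$cmd = 'C:\Windows\System32\curl.exe -o "%AppData%\pack.zip" "https://example.com/pack.zip" 2>&1';
exec($cmd, $output, $exitCode);
echo "exit code: $exitCode\n";
echo implode("\n", $output), "\n";
?>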

Post file from Command Prompt

I am working on a batch script that runs a SQL query and saves the result to a file. The file will then be handled by PHP. Is it possible to POST a file from the Windows command prompt to a PHP site so it can be handled by PHP with $_FILES['someFile']?
Yes, you can use curl for this.
curl -F "someFile=@localfile.sql" http://example.org/upload
Or you can use wget, though note that --post-file sends the file as the raw request body rather than as a multipart form upload, so PHP will see it in php://input instead of $_FILES:
wget --post-file=file.jpg http://yourdomain.com/target.php
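On the PHP side, a minimal receiver for the curl -F upload might look like the sketch below; upload.php and the uploads/ directory are hypothetical names, and the field name matches someFile from the question.
<?php
// upload.php (sketch): store the uploaded SQL file in an uploads/ directory next to this script.
if (isset($_FILES['someFile']) && $_FILES['someFile']['error'] === UPLOAD_ERR_OK) {
    $target = __DIR__ . '/uploads/' . basename($_FILES['someFile']['name']);
    move_uploaded_file($_FILES['someFile']['tmp_name'], $target);
    echo "stored " . htmlspecialchars($target);
} else {
    http_response_code(400);
    echo "no file received";
}
?>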

How to pass POST parameters to Chromium via the command line?

I use chromium --incognito www.mysite.com/page.php?msg=mymessage to open my website and pass it a msg.
I'd like to know how to pass the same msg parameter via POST instead of GET, from the command line.
Do you do anything with the site in Chromium after opening it? Otherwise you could use a more capable command-line HTTP client like curl(1), which would make this very easy.
See this example:
curl --data "param1=value1&param2=value2" http://example.com/resource.cgi
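Applied to the msg parameter from the question, that would be something like:
curl --data "msg=mymessage" http://www.mysite.com/page.php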
From the console? I don't know, but you can try this extension: Advanced REST Client.
The web developers helper program to create and test custom HTTP requests.
Here is the link: https://chrome.google.com/webstore/detail/advanced-rest-client/hgmloofddffdnphfgcellkdfbfbjeloo

How do I get ALL images from a remote server using php/cURL

There are many examples of how to get a single image, but what is the PHP way of getting ALL the images?
Please read: http://curl.haxx.se/docs/faq.html#Can_I_do_recursive_fetches_with
You can probably call wget through PHP and do something like this:
wget -A png,jpeg,jpg -r http://www.yoursite.com &
This will spawn an asynchronous operation and download all files with the extensions listed in the -A option.
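A sketch of wrapping that call in PHP, assuming wget is available on the server and the site URL and output directory are placeholders:
<?php
// Sketch: recursively download images from a site into a local directory.
$site   = 'http://www.yoursite.com';   // placeholder from the answer above
$outDir = __DIR__ . '/images';

// -A limits downloads to the listed extensions, -r recurses, -P sets the output directory.
// Redirecting output and appending & lets the command run in the background.
$cmd = sprintf(
    'wget -A png,jpeg,jpg -r -P %s %s > /dev/null 2>&1 &',
    escapeshellarg($outDir),
    escapeshellarg($site)
);
exec($cmd);
?>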
I found a webbot example in a book once.
Sample code:
http://www.schrenk.com/nostarch/webbots/scripts/reader.php?show=image_capture_bot.php
You can download the required LIB_download_images.php by visiting the link below:
http://www.schrenk.com/nostarch/webbots/DSP_download.php
Good luck
