To meet my client's needs, I developed code that logs in via cURL:
log in to www.web1.com and store the cookies in cookie.txt
go to www.web2.com and browse a page using that cookie.txt
There is no problem with www.web2.com, but when I try to do the same with www.web3.com, a problem appears: www.web3.com uses its own session and cookies, which I also have to gather and use.
That means I need two sets of cookies, first those from www.web1.com and second those from www.web3.com, and then I request www.web3.com/somepage.
How can I do that?
You can execute a command-line call to curl from PHP to save cookies to a file, like so:
curl -c '/tmp/mycookies.txt' 'http://www.site.com/login.php'
Then use those cookies when submitting to the page, like so:
curl -b '/tmp/mycookies.txt' -d 'uname=MyLoginName&pass=MyPassword&action=login&x=67&y=11' 'http://www.site.com/login.php'
For more info about these command line flags:
http://curl.haxx.se/docs/manpage.html
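To address the two-site scenario from the question: the jars written by -c are plain Netscape-format text files, so the cookies from www.web1.com and www.web3.com can be merged into a single file. A sketch, with made-up file names and cookie values:

```shell
# Two hypothetical jars, as curl -c would write them (values invented):
printf '# Netscape HTTP Cookie File\nwww.web1.com\tFALSE\t/\tFALSE\t0\tsid\tabc123\n' > cookies-web1.txt
printf '# Netscape HTTP Cookie File\nwww.web3.com\tFALSE\t/\tFALSE\t0\ttoken\txyz789\n' > cookies-web3.txt

# Merge: keep a single header line, then every non-comment line from both jars.
{ echo '# Netscape HTTP Cookie File'
  grep -hv '^#' cookies-web1.txt cookies-web3.txt
} > cookies-merged.txt

cat cookies-merged.txt
```

The merged jar can then be sent with curl -b cookies-merged.txt 'http://www.web3.com/somepage'. Passing -b twice, once per file, should also work, since curl accepts that option multiple times.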
You can use the following line to get the cookie information:
curl -k -s -d'user=foo&pass=bar' -D- https://server1.com/login/ -o/dev/null -f
Use shell_exec or exec to run this command. After getting the header information, you can parse out the cookies; use a helper class or write your own parser -> http://framework.zend.com/manual/en/zend.http.cookies.html (Zend_Http_Cookie::fromString)
You can store this information in a session rather than a text file. For web3.com, grab the cookie information as well and save it in the session or in the cookie.txt file.
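If you'd rather not pull in a framework, the header dump that -D writes can be parsed with a one-liner. A sketch, where the header contents below are invented for illustration:

```shell
# A response-header dump as curl -D would save it (contents invented):
cat > headers.txt <<'EOF'
HTTP/1.1 200 OK
Set-Cookie: PHPSESSID=abc123; path=/; HttpOnly
Set-Cookie: auth=xyz789; path=/; secure
Content-Type: text/html
EOF

# Keep only the name=value pair of each Set-Cookie line, dropping
# attributes such as path, secure and HttpOnly:
sed -n 's/^[Ss]et-[Cc]ookie: *\([^;]*\).*/\1/p' headers.txt
```

Each printed name=value pair is what you would store in the session and replay on later requests.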
I need to download several zip files from this web page:
http://www.geoportale.regione.lombardia.it/download-pacchetti?p_p_id=dwnpackageportlet_WAR_geoportaledownloadportlet&p_p_lifecycle=0&metadataid=%7B16C07895-B75B-466A-B980-940ECA207F64%7D
using curl or wget, i.e. in a non-interactive way.
A sample URL is the following:
http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12
If I use this link in a new browser tab or window, everything works fine, but with curl or wget it's not possible to download the zip file.
Watching what happens in the browser with Firebug (or the browser console in general), I can see that there is first a POST request and then a GET request, and I'm not able to reproduce these requests using curl or wget.
Could it also be that some cookies are set in the browser session, and the links don't work without them?
Any suggestion will be appreciated.
Cesare
NOTE: when I try to use wget, this is my result
NOTE 2: 404 Not Found
NOTE 3 (the solution): the right command is
wget "http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12"
then I have to rename the downloaded file to something like "pippo.zip"; or, better, use the -O option like this:
wget "http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12" -O pippo.zip
Looking at your command, you're missing the double quotes. Your command should be:
wget "http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12"
That should download it properly.
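The reason the quotes matter: an unquoted & is a shell control operator, so the command line gets split at that point. A minimal illustration:

```shell
# Unquoted, the shell treats '&' as "run in background": wget would be handed
# only '...package?dbId=323', and the shell would then try to run 'cod=12' as
# a separate command. Quoting keeps the full query string together:
url="http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12"
echo "$url"
```

Single quotes work just as well here; the point is only that the & must not reach the shell unquoted.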
I have an API that exports some information to a CSV file. The API is correct, and it downloads the file when I access it from the browser. I need to call this API from the terminal and download the file without going through the browser.
My route for the API looks like this:
Route::get('/api/file/export', [
    'middleware' => 'auth.basic',
    'uses' => 'File\FileController@export'
]);
I tried using curl like this:
curl --user email:password http://example.com/api/file/export
I have tried different curl commands, but each of them just outputs the redirect-to-login HTML. When I use -O, the option for downloading a file, it downloads a file that contains the redirect-to-login link.
curl --user email:password -O http://example.com/api/file/export
Am I calling the API correctly? How else can I access the API from the terminal?
You should first be logged in to the website, saving the cookies you get back. You can try this:
curl --user email:password -c cookies.txt http://domain.tld/login_page
And then use the cookies for your second request:
curl --cookie cookies.txt http://domain.tld/file/to/export
If that is not working, you need to perform the whole submit action with cURL, meaning doing a POST request with the email, password, etc.
Someone gave a good solution here
PS: Check whether you need a token to request your API, too.
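For what it's worth, --user email:password is just shorthand for sending an HTTP Basic Authorization header whose value is the base64 encoding of email:password; a token-based API would use the same -H mechanism with a different header value. A sketch, with placeholder credentials:

```shell
# --user email:password makes curl send: Authorization: Basic base64("email:password")
auth=$(printf '%s' 'email:password' | base64)
echo "Authorization: Basic $auth"

# The explicit-header equivalent of the original command would be:
# curl -H "Authorization: Basic $auth" http://example.com/api/file/export
```

Seeing the header spelled out can help when debugging with -v, since that is what actually goes over the wire.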
I'm trying to get cURL to work with a WordPress installation that requires a user to be logged in to see the content. I've exported the cookies from my logged-in session and saved them in a cookie jar file. When I run curl, I just get redirected to a login screen. Any idea why?
The command I am running is:
curl --cookie cookie_jar.txt -L <url>
You should make two requests, because the cookie in the cookie jar file that verifies authentication may have gone stale or expired. The first request should go to the script that handles authentication, i.e. the action script that the login page's form POSTs the authentication variables to; on a stock WordPress install that is wp-login.php, and the form fields are named log and pwd. Note the --cookie-jar option as well, so the authentication cookies sent back by the login are written to the file:
curl --cookie cookie_jar.txt --cookie-jar cookie_jar.txt --data "log=someguy&pwd=mahpass" -L http://mywordpressblog.com/wp-login.php
Now any subsequent request using the same cookie jar will send the stored cookies that match the cookie domain:
curl --cookie cookie_jar.txt -L http://mywordpressblog.com/page-i-wanted-all-along.php
I am working with a PHP page that contains multiple forms. I enter information into the first form and click submit; that takes me to a second form, where I check "I agree" and hit submit. Both forms live on the same PHP page, however.
I can get both curl and wget to act on the first form but not the second. I tried saving the cookie and loading it, but neither seems to work. Here is my code for both:
curl -k --cookie-jar cookie.txt -d "login=$user&pass=$pass" https://member.testurl.com/login.php
curl -k --cookie cookie.txt -d "i_agree=1&do_agreement=1&data=$data" https://member.testurl.com/login.php
wget --save-cookies testcookie.txt --keep-session-cookies --post-data "login=$user&pass=$pass" https://member.testurl.com/login.php
wget --load-cookies testcookie.txt --post-data "i_agree=1&do_agreement=1&data=$data" https://member.testurl.com/login.php
As I said, the cookie from the first form appears to be saved, since there is data in the cookie file, but I am unable to get to the second form. The second command in either pair seems to post to the first form again instead of the second. Is the cookie not loading? Is something peculiar about the page?
I'm trying to write a bash script that runs a few simple curl commands.
I can generate a cookie that I want to use with the following command:
curl -c cookie --data "user=user&password=pass" //example/login.php
However, when I try passing it to the site again, the cookie is ignored; it's as if I never logged in in the first place. The command I'm using is the following:
curl -b cookie //example/
What am I doing wrong?
Any help would be appreciated thanks.
It turns out I wasn't generating my cookie correctly after all.
I was actually missing some additional variables that were required in my POST data, which interacted with the PHP login script I was using.
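Those "additional variables" are a common gotcha: login forms often carry hidden inputs (CSRF tokens and the like) that must be echoed back in the POST. One way to spot them is to save the login page and grep for hidden inputs; the HTML below is invented for illustration:

```shell
# A saved copy of the login page (contents invented for the example):
cat > login.html <<'EOF'
<form method="post" action="login.php">
  <input type="hidden" name="csrf_token" value="deadbeef">
  <input type="text" name="user">
  <input type="password" name="password">
</form>
EOF

# List every hidden input, i.e. the extra variables the POST must include:
grep -o '<input type="hidden"[^>]*>' login.html
```

Each hidden name=value pair found this way then gets appended to the --data string alongside the user and password fields.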
You may want to store the cookie that comes back. You do so by specifying a cookie file:
curl -c cookies.txt -d "user=user&password=pass" //example/login.php
and to use those cookies in later requests you do:
curl -b cookies.txt //example/