cURL able to create a cookie but not use it - PHP

I'm trying to write a bash script that runs a few simple curl commands.
I can generate a cookie that I want to use with the following command:
curl -c cookie --data "user=user&password=pass" //example/login.php
However, when I try passing it to the site again, the cookie is ignored. It's as if I never logged in in the first place. The command I'm using is:
curl -b cookie //example/
What am I doing wrong?
Any help would be appreciated, thanks.

It turns out I wasn't generating my cookie correctly after all.
I was missing some additional variables that were required in my POST data, which the PHP login script I was using expected.
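Since the fix was extra POST fields, it can help to dump the login page first and look for hidden inputs the form expects. A sketch of the idea (the form and field names here are made up; in a real run the HTML would come from `curl -s` against the login page):

```shell
#!/bin/sh
# Fetch the login page and list any hidden inputs the form carries.
# (A here-doc stands in for the real `curl -s //example/login.php` output.)
cat > login.html <<'EOF'
<form method="post" action="login.php">
  <input type="text" name="user">
  <input type="password" name="password">
  <input type="hidden" name="token" value="abc123">
</form>
EOF

# Every hidden field listed here must also appear in the curl --data string.
grep -o '<input type="hidden"[^>]*>' login.html
```

Any field this prints (here, `token=abc123`) has to be appended to the `--data` argument alongside `user` and `password`, or the login script will reject the POST.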

You may want to store the cookie that comes back. You do so by specifying a cookie file:
curl -c cookies.txt -d "user=user&password=pass" //example/login.php
and to use those cookies in later requests you do:
curl -b cookies.txt //example/

Related

Download a zip file using curl or wget

I need to download several zip files from this web page:
http://www.geoportale.regione.lombardia.it/download-pacchetti?p_p_id=dwnpackageportlet_WAR_geoportaledownloadportlet&p_p_lifecycle=0&metadataid=%7B16C07895-B75B-466A-B980-940ECA207F64%7D
using curl or wget, i.e. not interactively.
A sample URL is the following:
http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12
If I use this link in a new browser tab or window, everything works fine, but with curl or wget it's not possible to download the zip file.
Watching what happens in the browser with Firebug (or the browser console in general), I can see that there is first a POST request and then a GET request, and I'm not able to reproduce these requests using curl or wget.
Could it also be that some cookies are set in the browser session, and the links don't work without them?
Any suggestion will be appreciated.
Cesare
NOTE: when I try to use wget, this is my result:
NOTE 2: 404 Not found
NOTE 3 (the solution): the right command is
wget "http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12"
then I have to rename the file to something like "pippo.zip", or, better, use the -O option in this manner:
wget "http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12" -O pippo.zip
Looking at your command, you're missing the double quotes. Your command should be:
wget "http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12"
That should download it properly.
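The quotes matter because an unquoted `&` is a shell control operator: the command line is cut at the `&`, everything before it runs in the background with a truncated URL, and `cod=12` becomes a separate command. A self-contained demonstration, with a `printargs` helper standing in for wget:

```shell
#!/bin/sh
# printargs prints each argument it receives on its own line, so we can see
# exactly what the shell passes to the program.
printargs() { printf '%s\n' "$@"; }

# Unquoted: the shell splits the line at '&'. printargs runs in the
# background and only ever sees the URL truncated at 'dbId=323';
# 'cod=12' is executed afterwards as a separate variable assignment.
printargs http://example.com/package?dbId=323 & cod=12
wait

# Quoted: the whole URL reaches the program as one argument.
printargs "http://example.com/package?dbId=323&cod=12"
```

The same splitting is why the original wget call returned a 404: the server never saw the `cod=12` parameter.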

Following URL: curl couldn't resolve host 'GET'

I have a page (built with a PHP framework) that adds records to a MySQL db in this way:
www.mysite.ext/controller/addRecord.php?id=number
which adds a row to a table with the id number passed in the query string, plus other information such as a timestamp, etc.
So, I moved my entire web application to another domain, and all HTTP requests work fine from the old domain to the new one.
The only remaining issue is curl: I wrote a bash script (under Linux) that runs curl on this link. Now, obviously, it no longer works, because curl returns a message saying the page was moved.
OK, I edited the curl syntax this way:
#! /bin/sh
link="www.myoldsite.ext/controlloer/addRecord.php?id=number"
curl --request -L GET $link
I added -L to follow the URL to its new location, but curl returns the error in this topic's title.
It would be easier if I could directly modify the link, adding the new domain, but I do not have physical access to all the devices.
GET is the default request method for curl, and that's not the way to set it anyway.
curl -X GET ...
That is the way to set GET as the method keyword that curl uses.
It should be noted that curl selects which method to use on its own, depending on what action it is asked to perform: -d will do POST, -I will do HEAD, and so on. If you use the --request / -X option you can change the method keyword curl sends, but you will not modify curl's behavior. This means that if, for example, you use -d "data" to do a POST, you can change the method keyword to PROPFIND with -X, and curl will still behave as if it were sending a POST. You can change the normal GET to a POST method by simply adding -X POST on the command line:
curl -X POST http://example.org/
... but curl will still think and act as if it sent a GET so it won't send any request body etc.
More here: http://curl.haxx.se/docs/httpscripting.html#More_on_changed_methods
Again, that's not necessary here. Are you sure the link is correct?
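For completeness: in the broken command `curl --request -L GET $link`, curl takes `-L` as the value of `--request` and then treats the bare word `GET` as a URL to fetch, which is exactly what produces "couldn't resolve host 'GET'". A corrected version of the script, with `echo` standing in for the real network call in this sketch:

```shell
#!/bin/sh
# -L follows the redirect to the new domain; no -X is needed because GET is
# curl's default. Quoting "$link" keeps the query string intact.
link="www.myoldsite.ext/controlloer/addRecord.php?id=number"
echo curl -L "$link"
```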

wget or curl with two forms on same php page

I am working with a PHP page with multiple forms. I enter information into the first form and click submit. That takes me to the second form, where I check "I agree" and hit submit. Both forms are on the same PHP page, however.
I can get both curl and wget to act on the first form but not the second. I tried saving the cookie and loading it, but neither seems to work. Here is my code for both:
curl -k --cookie-jar cookie.txt -d "login=$user&pass=$pass" https://member.testurl.com/login.php
curl -k --cookie cookie.txt -d "i_agree=1&do_agreement=1&data=$data" https://member.testurl.com/login.php
wget --save-cookies testcookie.txt --keep-session-cookies --post-data "login=$user&pass=$pass" https://member.testurl.com/login.php
wget --load-cookies testcookie.txt --post-data "i_agree=1&do_agreement=1&data=$data" https://member.testurl.com/login.php
As I said, the cookie from the first form appears to be saved, since there is data in the cookie file, but I am unable to get to the second form. The second part of either command seems to post to the first form again instead of the second. Is the cookie not loading? Is something peculiar about the page?
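One thing to try (a sketch, not a verified fix for this particular site): use the same jar for both reading and writing on every request, so any cookie the first response sets, or the second response updates, carries over to the next step. The URLs and field names are the asker's; the `run` helper echoes the command instead of executing it so the sketch is self-contained:

```shell
#!/bin/sh
# Read (-b) and write (-c) the SAME jar on every request, so cookies set at
# each step persist into the following one. 'run' echoes instead of running.
run() { printf 'curl %s\n' "$*"; }
jar=cookie.txt

# $user, $pass and $data are left as literal placeholders in this sketch;
# in the real script they expand as in the original commands.
run -k -b "$jar" -c "$jar" -d 'login=$user&pass=$pass' https://member.testurl.com/login.php
run -k -b "$jar" -c "$jar" -d 'i_agree=1&do_agreement=1&data=$data' https://member.testurl.com/login.php
```

If that still lands on the first form, the second POST may need a hidden field or a different action URL that the browser picks up from the page in between; dumping the HTML returned by the first request would show it.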

Use Session in Cronjob (Crontab)

Hi, is it possible to use sessions in a cron job?
The script I use is:
session_start();
if(empty($_SESSION['startwert'])){$startwert = 0;}
else {$startwert = $_SESSION['startwert'];}
if(empty($_SESSION['zielwert'])){$zielwert = 10000;}
else {$zielwert = $_SESSION['zielwert'];}
....
$_SESSION['startwert'] = $zielwert;
$_SESSION['zielwert'] = $zielwert + 10000;
echo "Startwert: ".$_SESSION['startwert']."<br>";
echo "Zielwert: ".$_SESSION['zielwert']."<br>";
But the cron always sets "startwert" to 10000 and "zielwert" to 20000, and it does not increase the values.
OK, now I have tried this:
/usr/bin/wget -O - http://mydomain.com/script.php
But the cron always starts with 10000 and 20000. Any ideas?
If you're invoking the PHP script from cron via wget, use the --save-cookies option; if via curl, use --cookie-jar. (If you're invoking the PHP script via php -f [...] or similar, then you'll first need to invoke it via wget or curl instead.)
For example:
wget --load-cookies /tmp/cron-session-cookies --save-cookies /tmp/cron-session-cookies --keep-session-cookies [...]
or
curl -b /tmp/cron-session-cookies --cookie-jar /tmp/cron-session-cookies [...]
wget by default doesn't keep session cookies across runs, which here you want it to do, hence the --keep-session-cookies option; for curl, -b <file> reads cookies back in at startup and --cookie-jar <file> writes them out on exit, so both point at the same file. In either case, replace the [...] with whatever options and arguments you're already passing to the program, and adjust the location of the cookie-jar file to taste.
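Putting it together, a crontab entry along these lines would keep the session alive between runs (the URL is the asker's; the five-minute schedule is just a placeholder):

```
*/5 * * * * /usr/bin/wget -q -O - --load-cookies /tmp/cron-session-cookies --save-cookies /tmp/cron-session-cookies --keep-session-cookies http://mydomain.com/script.php
```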
Not really. PHP sessions depend on cookies (ignoring trans-sid mode), which really only exist in an HTTP context. Cron jobs run in CLI mode, so there's no HTTP layer to deal with.
You CAN force a CLI script to use a particular session file by setting the session ID before calling session_start(), but there's no guarantee that particular ID would still exist when the cron job starts, as some other PHP instance's session garbage collector may have deleted it.

curl cookie problem in PHP

To meet my client's needs, I developed code to log in via cURL:
log in to www.web1.com and store the cookies in cookie.txt
go to www.web2.com and browse a page using that cookie.txt
No problem with www.web2.com.
But when I want to do the same with www.web3.com, the problem appears:
www.web3.com uses sessions and cookies itself, and I have to gather and use them.
That means I need two sets of cookies, first those from www.web1.com and second those from www.web3.com, and then request www.web3.com/somepage.
How can I do that?
You can execute a command-line call to curl from PHP to save cookies to a file like so:
curl -c '/tmp/mycookies.txt' 'http://www.site.com/login.php'
Then use those cookies when submitting to the page like so:
curl -b '/tmp/mycookies.txt' -d 'uname=MyLoginName&pass=MyPassword&action=login&x=67&y=11' 'http://www.site.com/login.php'
For more info about these command line flags:
http://curl.haxx.se/docs/manpage.html
You can use the following line to get the cookie information:
curl -k -s -d'user=foo&pass=bar' -D- https://server1.com/login/ -o/dev/null -f
Use shell_exec or exec to run this command. After getting the header information you can parse the cookie information. Use a helper class or write your own parser -> http://framework.zend.com/manual/en/zend.http.cookies.html (Zend_Http_Cookie::fromString)
You can store this information in a session rather than in a text file. For web3.com, grab the cookie information too and save it in the session or the cookie.txt file.
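A simpler option worth knowing: cookie-jar entries are keyed by domain, so a single jar can hold the cookies from both sites at once, and curl only sends the entries whose domain matches the request's host. A look at the Netscape jar format with made-up entries for the two domains:

```shell
#!/bin/sh
# Build a Netscape-format jar holding cookies for two domains at once.
# Fields are tab-separated: domain, include-subdomains, path, secure,
# expiry (0 = session cookie), name, value.
printf '# Netscape HTTP Cookie File\n' > jar.txt
printf 'www.web1.com\tFALSE\t/\tFALSE\t0\tsess1\taaa\n' >> jar.txt
printf 'www.web3.com\tFALSE\t/\tFALSE\t0\tsess3\tbbb\n' >> jar.txt

# With one jar the whole flow becomes (not executed in this sketch):
#   curl -b jar.txt -c jar.txt http://www.web1.com/login.php
#   curl -b jar.txt -c jar.txt http://www.web3.com/login.php
#   curl -b jar.txt http://www.web3.com/somepage

# Two domains, one file:
grep -c '^www\.' jar.txt
```

The request to www.web3.com/somepage then automatically carries only the web3.com cookies, with no manual merging needed.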