The following cURL configuration works fine on my local machine using cURL 7.30.0:
$curl = curl_init();
curl_setopt_array($curl, array(
// Just showing the noteworthy options here.
CURLOPT_HTTPHEADER => array("Content-Type: application/x-www-form-urlencoded"),
CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_1_0,
CURLOPT_COOKIE => "foo=bar",
));
$response = curl_exec($curl);
curl_close($curl);
Excerpt of the debugging output:
> GET / HTTP/1.0
Host: example.com
Accept: */*
Cookie: foo=bar
Content-Type: application/x-www-form-urlencoded
Now I run the same code on a shared hosting environment with cURL 7.19.7 and get:
> GET / HTTP/1.1
Host: example.com
Accept: */*
Content-Type: application/x-www-form-urlencoded
Basically cURL is working 99% fine, but ignores the forced HTTP version and cookie string. Is the hosting company running a configuration that blocks these features? Is the cURL version they are running too old? What's going on here?
I found it: curl_setopt_array() stops processing options as soon as one option fails, which I didn't know. I should have checked the return value to make sure everything was set.
In my case the culprit was the CURLOPT_FOLLOWLOCATION option. It probably failed because the hosting provider runs PHP in safe mode, which disables following 301/302 redirects.
$curl = curl_init();
$check = curl_setopt_array($curl, $options);
if(!$check)
die("Ye be warned: one of your options did not make it.");
$response = curl_exec($curl);
curl_close($curl);
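If curl_setopt_array() does report a failure, a quick way to pinpoint the offending option is to set the options one at a time and check each curl_setopt() call. A minimal sketch, assuming $options is the same associative array passed above:
$curl = curl_init();
foreach ($options as $option => $value) {
    // curl_setopt() returns false for the first option this environment rejects
    // ($option here is the numeric value of the CURLOPT_* constant).
    if (!curl_setopt($curl, $option, $value)) {
        die("Option {$option} was rejected by this cURL/PHP configuration.");
    }
}
$response = curl_exec($curl);
curl_close($curl);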
I want to get the content of this page with PHP cURL:
My cURL sample:
function curll($url,$headers=null){
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,$url);
if ($headers){
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
}
curl_setopt($ch, CURLOPT_ENCODING, '');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:59.0) Gecko/20100101 Firefox/59.0');
curl_setopt($ch, CURLOPT_HEADER, 1);
curl_setopt($ch, CURLINFO_HEADER_OUT, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch, CURLOPT_TIMEOUT, 60);
$response = curl_exec($ch);
$res['headerout'] = curl_getinfo($ch,CURLINFO_HEADER_OUT);
$res['rescode'] = curl_getinfo($ch, CURLINFO_HTTP_CODE);
if ($response === false) {
$res['content'] = $response;
$res['error'] = array(curl_errno($ch),curl_error($ch));
return $res;
}
$header_size = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
$res['headerin'] = substr($response, 0, $header_size);
$res['content'] = substr($response, $header_size);
return $res;
}
response:
array (size=4)
'headerout' => string 'GET /wallets HTTP/1.1
Host: www.cryptocompare.com
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:59.0) Gecko/20100101 Firefox/59.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Encoding: br
Accept-Language: en-US,en;q=0.5
Connection: keep-alive
Upgrade-Insecure-Requests: 1
' (length=327)
'rescode' => string '200' (length=3)
'content' => boolean false
'error' =>
array (size=2)
0 => int 23
1 => string 'Unrecognized content encoding type. libcurl understands deflate, gzip content encodings.' (length=88)
The response encoding is br and the response content is false.
I am aware that using gzip or deflate as the encoding would get me content. However, the content that I have in mind is only served with br encoding.
I read on this page that cURL 7.57.0 supports Brotli compression. I currently have version 7.59.0 installed, but cURL still raises an error when it receives content in br encoding.
Now I want to know: how can I get the content of a page with br encoding and decompress it with PHP cURL?
I had the exact same issue because one server was only able to return Brotli and my PHP's bundled cURL version didn't support Brotli. I had to use a PHP extension: https://github.com/kjdev/php-ext-brotli
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'URL');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$output_brized = curl_exec($ch);
$output_ok = brotli_uncompress($output_brized);
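For completeness, here is a slightly fuller sketch along the same lines (the target URL is the one from the question and the php-ext-brotli extension is assumed): request br explicitly, then decompress only when the response really is Brotli-encoded.
$ch = curl_init('https://www.cryptocompare.com/wallets');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, true);
// Ask for br explicitly; libcurl will not decode it, so we do that ourselves below.
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Accept-Encoding: br'));
$response = curl_exec($ch);
if ($response === false) {
    die(curl_error($ch));
}
$header_size = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
$headers = substr($response, 0, $header_size);
$body = substr($response, $header_size);
curl_close($ch);
// Only decompress when the server actually answered with Brotli.
if (stripos($headers, 'Content-Encoding: br') !== false) {
    $body = brotli_uncompress($body);
}
echo $body;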
I checked and, with PHP 7.4.9 on Windows with bundled cURL version 7.70.0, setting the CURLOPT_ENCODING option to '' (like you did) forced the bundled cURL to make the request with one additional header, accept-encoding: deflate, gzip, which are the content encodings the bundled cURL can decode. If I omitted this option, there were just two headers: Host: www.google.com and accept: */*.
Indeed, searching the PHP source code (https://github.com/php/php-src/search?q=CURLOPT_ENCODING) for this CURLOPT_ENCODING option leads to nothing that would set a default value or change the value from PHP. PHP passes the option value to cURL without altering it, so what I am observing is the default behavior of my bundled cURL version.
I then discovered that cURL supports Brotli from version 7.57.0 (https://github.com/curl/curl/blob/bf1571eb6ff24a8299da7da84408da31f0094f66/docs/libcurl/symbols-in-versions), released in November 2017 (https://github.com/curl/curl/blob/fd1ce3d4b085e7982975f29904faebf398f66ecd/docs/HISTORY.md), but it requires cURL to be compiled with the --with-brotli flag (https://github.com/curl/curl/blob/9325ab2cf98ceca3cf3985313587c94dc1325c81/configure.ac), which was probably not used for my PHP build.
Unfortunately, there is no curl_getopt() function to read the default value of an option. But phpinfo() gives a valuable hint: I got a BROTLI => No line, which confirms my version was not compiled with Brotli support. You may want to check your phpinfo() to find out whether your bundled cURL should support Brotli. If it doesn't, use my solution. If it does, more investigation needs to be done to find out whether it's a bug or a misuse.
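As a programmatic alternative to reading phpinfo(), you can also inspect the feature bitmask reported by the bundled libcurl; a small sketch, assuming PHP 7.3+ where the CURL_VERSION_BROTLI constant is defined:
$info = curl_version();
if (defined('CURL_VERSION_BROTLI') && ($info['features'] & CURL_VERSION_BROTLI)) {
    echo 'Bundled libcurl was built with Brotli support';
} else {
    echo 'No Brotli support; fall back to the php-ext-brotli extension';
}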
If you want to know what your Curl sent, you have to use a proxy like Charles/Fiddler or use Curl verbose mode.
Additionally, for the sake of completeness, the HTTP/1.1 spec (https://www.rfc-editor.org/rfc/rfc2616#page-102) says:
If an Accept-Encoding field is present in a request, and if the
server cannot send a response which is acceptable according to the
Accept-Encoding header, then the server SHOULD send an error response
with the 406 (Not Acceptable) status code.
If no Accept-Encoding field is present in a request, the server MAY
assume that the client will accept any content coding.
So, if your PHP version behaved the same as mine, the website should have received an Accept-Encoding header not containing br, so it should NOT have replied with br content; instead, it should have replied with gzip or deflate content or, if it was not able to do so, replied with a 406 Not Acceptable instead of a 200.
If you are using Cloudflare, you can try disabling the Brotli feature in Cloudflare.
I am trying to make a REST request to a external webserver by using this code
<?php
$user = 'USER';
$pass = 'PASS';
$data = "MYDATA";
$ch = curl_init('URL');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
'Content-Type: application/json',
'Content-Length: ' . strlen($data))
);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_VERBOSE, true);
if(!($res = curl_exec($ch))) {
echo('[cURL Failure] ' . curl_error($ch));
}
curl_close($ch);
echo($res);
Now this is a cURL request, however I tried different methods to test my result and they all give me a 403 Forbidden error response from the webserver. However, I do get a 200 response when I run it on any other webserver (localhost, webserver2, ...). Therefore I think there is something wrong with my webserver; it might be disallowing or caching the POST parameters that I provide, because sometimes it returns a 200 response but most of the time it returns the 403.
This is the response I get:
HTTP/1.1 403 Forbidden
Accept-Ranges: bytes
Content-Type: application/json; charset=UTF-8
Date: Sat, 26 Oct 2013 13:56:37 GMT
Server: Restlet-Framework/2.1.3
Vary: Accept-Charset, Accept-Encoding, Accept-Language, Accept
Content-Length: 77
Connection: keep-alive
{"error":"ForbiddenOperationException","errorMessage":"Invalid credentials."}
It says invalid credentials, however I provide the correct credentials, and I can confirm they work because the same request succeeds on other servers.
Since this is a crucial part of the script that I use for client registration, I assume there is something wrong with the POST parameters.
I am running cPanel and have already uninstalled the following:
- varnish
- apachebooster
I also recompiled PHP and enabled cURL and its dependencies, but nothing seems to resolve my problem.
If more information is required, don't hesitate to ask in the comments; I will respond very quickly as I really need this.
Any help is appreciated.
Kind regards, Maxim
Morning all
Basically, I am unable to make successful cURL requests to internal and external servers from my Windows 7 development PC because of an issue involving a proxy server. I'm running cURL 7.21.2 through PHP 5.3.6 on Apache 2.4.
Here's a most basic request that fails:
<?php
$curl = curl_init('http://www.google.com');
$log_file = fopen(sys_get_temp_dir() . '/curl.log', 'w');
curl_setopt_array($curl, array(
CURLOPT_RETURNTRANSFER => TRUE,
CURLOPT_VERBOSE => TRUE,
CURLOPT_HEADER => TRUE,
CURLOPT_STDERR => $log_file,
));
$response = curl_exec($curl);
#fclose($log_file);
print "<pre>{$response}";
The following (complete) response is received.
HTTP/1.1 400 Bad Request
Date: Thu, 06 Sep 2012 17:12:58 GMT
Content-Length: 171
Content-Type: text/html
Server: IronPort httpd/1.1
Error response
Error code 400.
Message: Bad Request.
Reason: None.
The log file generated by cURL contains the following.
* About to connect() to proxy usushproxy01.unistudios.com port 7070 (#0)
* Trying 216.178.96.20... * connected
* Connected to usushproxy01.unistudios.com (216.178.96.20) port 7070 (#0)
> GET http://www.google.com HTTP/1.1
Host: www.google.com
Accept: */*
Proxy-Connection: Keep-Alive
< HTTP/1.1 400 Bad Request
< Date: Thu, 06 Sep 2012 17:12:58 GMT
< Content-Length: 171
< Content-Type: text/html
< Server: IronPort httpd/1.1
<
* Connection #0 to host usushproxy01.unistudios.com left intact
Explicitly stating the proxy and user credentials, as in the following, makes no difference: the response is always the same.
<?php
$curl = curl_init('http://www.google.com');
$log_file = fopen(sys_get_temp_dir() . '/curl.log', 'w');
curl_setopt_array($curl, array(
CURLOPT_RETURNTRANSFER => TRUE,
CURLOPT_VERBOSE => TRUE,
CURLOPT_HEADER => TRUE,
CURLOPT_STDERR => $log_file,
CURLOPT_PROXY => 'http://usushproxy01.unistudios.com:7070',
CURLOPT_PROXYUSERPWD => '<username>:<password>',
));
$response = curl_exec($curl);
#fclose($log_file);
print "<pre>{$response}";
I was surprised to see an absolute URL in the request line ('GET ...'), but I think that's fine when dealing with proxy servers - according to the HTTP spec.
I've tried all sorts of combinations of options - including sending a user-agent, following this and that, etc, etc - having been through Stack Overflow questions, and other sites, but all requests end in the same response.
The same problem occurs if I run the script on the command line, so it can't be an Apache issue, right?
If I make a request using cURL from a Linux box on the same network, I don't experience a problem.
It's the "Bad Request" thing that's puzzling me: what on earth is wrong with my request? Do you have any idea why I may be experiencing this problem? A Windows thing? A bug in the version of PHP/cURL I'm using?
Any help very gratefully received. Many thanks.
You might be looking at an issue between cURL (different versions between Windows and Linux) and your IronPort version. In IronPort documentation:
Fixed: Web Proxy uses the Proxy-Connection header instead of the
Connection header, causing problems with some user agents
Previously, the Web Proxy used the Proxy-Connection header instead of the
Connection header when communicating with user agents with explicit
forward requests. Because of this, some user agents, such as Real
Player, did not work as expected. This no longer occurs. Now, the Web
Proxy replies to the client using the Connection header in addition to
the Proxy-Connection header. [Defect ID: 46515]
Try removing the Proxy-Connection header (or adding a Connection header) and see whether this solves the problem.
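As a hedged sketch of that suggestion: cURL lets you drop a header it would otherwise generate by passing the bare header name with an empty value in CURLOPT_HTTPHEADER, and since the request is not tunneled those headers go to the proxy. Something along these lines (proxy host and credentials taken from the question) may be worth trying:
$curl = curl_init('http://www.google.com');
curl_setopt_array($curl, array(
    CURLOPT_RETURNTRANSFER => TRUE,
    CURLOPT_PROXY => 'http://usushproxy01.unistudios.com:7070',
    CURLOPT_PROXYUSERPWD => '<username>:<password>',
    CURLOPT_HTTPHEADER => array(
        'Proxy-Connection:',      // suppress the Proxy-Connection header cURL adds for proxies
        'Connection: Keep-Alive', // send a plain Connection header instead
    ),
));
$response = curl_exec($curl);
curl_close($curl);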
Also, you might want to compare the cURL logs between Windows and Linux hosts.
I'm having this strange error, CURL ERROR: Recv failure: Connection reset by peer
This is how it happens: if I haven't connected to the server for a while and then try to connect via cURL in PHP, I get the error. When I run the cURL script again the error disappears, and it then works fine the whole time. If I leave the remote server idle for about 30 minutes, or reboot it, and try to connect again, I get the error again. So it seems like the connection goes idle, and then all of a sudden the server wakes up, works, and then sleeps again.
This is how my CURL script looks.
$url = Yii::app()->params['pdfUrl'];
$body = 'title='.urlencode($title).'&client_url='.Yii::app()->params['pdfClientURL'].'&client_id='.Yii::app()->params['pdfClientID'].'&content='.urlencode(htmlentities($content));
$c = curl_init ($url);
$body = array(
"client_url"=>Yii::app()->params['pdfClientURL'],
"client_id"=>Yii::app()->params['pdfClientID'],
"title"=>urlencode($title),
"content"=>urlencode($content)
);
foreach($body as $key=>$value) { $body_str .= $key.'='.$value.'&'; }
rtrim($body_str,'&');
curl_setopt ($c, CURLOPT_POST, true);
curl_setopt ($c, CURLOPT_POSTFIELDS, $body_str);
curl_setopt ($c, CURLOPT_RETURNTRANSFER, true);
curl_setopt ($c, CURLOPT_CONNECTTIMEOUT , 0);
curl_setopt ($c, CURLOPT_TIMEOUT , 20);
$pdf = curl_exec ($c);
$errorCode = curl_getinfo($c, CURLINFO_HTTP_CODE);
$curlInfo = curl_getinfo($c);
$curlError = curl_error($c);
curl_close ($c);
I'm totally out of ideas and solutions, please help, I'll appreciate it!!!
If I verbose the output to see what happens using
curl_setopt ($c, CURLOPT_VERBOSE, TRUE);
curl_setopt($c, CURLOPT_STDERR, $fp);
I get the following
* About to connect() to 196.41.139.168 port 80 (#0)
* Trying 196.x.x.x... * connected
* Connected to 196.x.x.x (196.x.x.x) port 80 (#0)
> POST /serve/?r=pdf/generatePdf HTTP/1.1
Host: 196.x.x.x
Accept: */*
Content-Length: 7115
Content-Type: application/x-www-form-urlencoded
Expect: 100-continue
* Recv failure: Connection reset by peer
* Closing connection #0
012 20:23:49 GMT
< Server: Apache/2.2.15 (CentOS)
< X-Powered-By: PHP/5.3.3
< Connection: close
< Transfer-Encoding: chunked
< Content-Type: text/html; charset=UTF-8
<
* Closing connection #0
I've added the following to remove the default header, and still no luck:
curl_setopt ($c, CURLOPT_HTTPHEADER, array( 'Expect:' ) );
> Accept: */* Content-Length: 8414 Content-Type:
> application/x-www-form-urlencoded
>
> * Recv failure: Connection reset by peer
> * Closing connection #0 r: Apache/2.2.15 (CentOS) < X-Powered-By: PHP/5.3.3 < Connection: close < Transfer-Encoding: chunked <
> Content-Type: text/html; charset=UTF-8 <
> * Closing connection #0
Introduction
The remote server has sent you an RST packet, which indicates an immediate dropping of the connection rather than the usual graceful close.
Possible Causes
A. TCP/IP
It might be a TCP/IP issue you need to resolve with your host, or your OS may need an upgrade. Most of the time the connection is closed by the remote server before the content has finished downloading, resulting in Connection reset by peer.
B. Kernel Bug
Note that there are some issues with TCP window scaling on some Linux kernels after v2.6.17. See the following bug reports for more information:
https://bugs.launchpad.net/ubuntu/+source/linux-source-2.6.17/+bug/59331
https://bugs.launchpad.net/ubuntu/+source/linux-source-2.6.20/+bug/89160
C. PHP & CURL Bug
You are using PHP/5.3.3, which has some serious bugs too. I would advise you to move to a more recent version of PHP and cURL:
https://bugs.php.net/bug.php?id=52828
https://bugs.php.net/bug.php?id=52827
https://bugs.php.net/bug.php?id=52202
https://bugs.php.net/bug.php?id=50410
D. Maximum Transmission Unit
One common cause of this error is that the MTU (Maximum Transmission Unit) size of packets travelling over your network connection has been changed from the default of 1500 bytes.
If you have configured a VPN, this was most likely changed during configuration.
E. Firewall: iptables
If you don't know your way around iptables rules, they can cause some serious issues. Try to access the server you are connecting to and check the following:
You have access to port 80 on that server
Example
-A RH-Firewall-1-INPUT -m state --state NEW -m tcp -p tcp --dport 80 -j ACCEPT
The following rule is on the last line, not before any other ACCEPT
Example
-A RH-Firewall-1-INPUT -j REJECT --reject-with icmp-host-prohibited
Check all DROP and REJECT rules and make sure they are not blocking your connection
Temporarily allow all connections and see if it goes through
Experiment
Try the same script on a different or remote server (there are many free cloud hosts online). If it works there, then my guesses are correct and you need to update your system.
Others: Code Related
A. SSL
If Yii::app()->params['pdfUrl'] is an https URL, not including the proper SSL settings can also cause this error in old versions of cURL.
Resolution: Make sure OpenSSL is installed and enabled, then add this to your code:
curl_setopt($c, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($c, CURLOPT_SSL_VERIFYHOST, false);
Normally this error means that a connection was established with a server but that connection was closed by the remote server. This could be due to a slow server, a problem with the remote server, a network problem, or (maybe) some kind of security error with data being sent to the remote server but I find that unlikely.
Normally a network error will resolve itself given a bit of time, but it sounds like you’ve already given it a bit of time.
cURL sometimes has issues with SSL and SSL certificates.
I think your Apache and/or PHP was compiled with a recent version of the cURL and cURL SSL libraries, but I don't think OpenSSL was installed on your web server.
Although I cannot be certain, I believe cURL has historically been flaky with SSL certificates, whereas OpenSSL has not.
Anyway, try installing OpenSSL on the server and try again; that should help you get rid of this error.
I faced the same error, but in a different way.
It happens when you cURL a page with a specific SSL protocol.
curl --sslv3 https://example.com
If --sslv3 is not supported by the target server then the error will be
curl: (35) TCP connection reset by peer
With the supported protocol, error will be gone.
curl --tlsv1.2 https://example.com
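The PHP equivalent, as a sketch (assuming PHP's cURL is built against libcurl 7.34+, where CURL_SSLVERSION_TLSv1_2 is available):
$ch = curl_init('https://example.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Force TLS 1.2; with an unsupported protocol the request fails with a reset/handshake error.
curl_setopt($ch, CURLOPT_SSLVERSION, CURL_SSLVERSION_TLSv1_2);
$response = curl_exec($ch);
if ($response === false) {
    echo curl_error($ch);
}
curl_close($ch);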
So what is the URL that Yii::app()->params['pdfUrl'] gives? You say it should be https, but the log shows it's connecting on port 80... which almost no server is set up to accept https connections on. cURL is smart enough to know https should be on port 443... which would suggest that your URL has something wonky in it, like: https://196.41.139.168:80/serve/?r=pdf/generatePdf
That's going to cause the connection to be terminated when the Apache at the other end cannot do https communication with you on that port.
You realize your first $body definition gets replaced when you set $body to an array two lines later? (Probably just an artifact of you trying to solve the problem.) You're also not encoding the client_url and client_id values (the former quite possibly containing characters that need escaping!), and you're appending to $body_str without first initializing it.
From your verbose output we can see cURL is adding a Content-Length header, but is it correct? I can see some comments out on the internet about that number being wrong (especially with older versions); if that number was too small, for example, you'd get a connection reset before all the data is sent. You can manually insert the header:
curl_setopt ($c, CURLOPT_HTTPHEADER,
array("Content-Length: ". strlen($body_str)));
Oh and there's a handy function http_build_query that'll convert an array of name/value pairs into a URL encoded string for you.
All this rolls up into the final code:
$post=http_build_query(array(
"client_url"=>Yii::app()->params['pdfClientURL'],
"client_id"=>Yii::app()->params['pdfClientID'],
"title"=>$title,
"content"=>$content));
//Open to URL
$c=curl_init(Yii::app()->params['pdfUrl']);
//Send post
curl_setopt ($c, CURLOPT_POST, true);
//Optional: [try with/without]
curl_setopt ($c, CURLOPT_HTTPHEADER, array("Content-Length: ".strlen($post)));
curl_setopt ($c, CURLOPT_POSTFIELDS, $post);
curl_setopt ($c, CURLOPT_RETURNTRANSFER, true);
curl_setopt ($c, CURLOPT_CONNECTTIMEOUT , 0);
curl_setopt ($c, CURLOPT_TIMEOUT , 20);
//Collect result
$pdf = curl_exec ($c);
$curlInfo = curl_getinfo($c);
curl_close($c);
This is a firewall issue. If you are using a VMware application, make sure the firewall in the antivirus is turned off or allows connections.
If this server is on a secure network, please have a look at firewall rules of the server.
Thanks
Ganesh PNS
In my case there was a problem in the URL. I used https://example.com, but the site enforces 'www.', so when I switched to https://www.example.com everything was OK and the proper Host: www.example.com header was sent.
You can try making the request in the Firefox browser, persist it, and copy it as cURL; that's how I found it.
We had the same issue when making a websocket connection to the load balancer.
The issue was in the LB, which accepted an http connection on port 80 and forwarded the request to the node (a Tomcat app on port 8080).
We changed this to accept a tcp connection on port 80 (the listener protocol was changed from 'http' to 'tcp').
So the first handshake request is forwarded to the node and a websocket connection is made successfully on some random port (as far as I know; I may be wrong).
The command below was used to test the websocket handshake process.
curl -v -i -N -H "Connection: Upgrade" -H "Upgrade: websocket" -H "Host: localhost" -H "Origin: http://LB URL:80" http://LB URL
Rebuilt URL to: http:LB URL/
Trying LB URL...
TCP_NODELAY set
Connected to LB URL (LB URL) port 80 (#0)
GET / HTTP/1.1
Host: localhost
User-Agent: curl/7.60.0
Accept: */*
Connection: Upgrade
Upgrade: websocket
Origin: http://LB URL:80
Recv failure: Connection reset by peer
Closing connection 0
curl: (56) Recv failure: Connection reset by peer
For some reason I can't seem to get this particular web page's contents via cURL. I've managed to use cURL to get the "top level page" contents fine, but the same self-built quick cURL function doesn't seem to work for one of the linked sub-pages.
Top level page: http://www.deindeal.ch/
A sub page: http://www.deindeal.ch/deals/hotel-cristal-in-nuernberg-30/
My cURL function (in functions.php)
function curl_get($url) {
$ch = curl_init();
$header = array(
'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
'Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7',
'Accept-Language: en-us;q=0.8,en;q=0.6'
);
$options = array(
CURLOPT_URL => $url,
CURLOPT_HEADER => 0,
CURLOPT_RETURNTRANSFER => 1,
CURLOPT_USERAGENT => 'Mozilla/5.0 (Windows; U; Windows NT 6.1; en; rv:1.9.2.13) Gecko/20101203 Firefox/3.6.13',
CURLOPT_HTTPHEADER => $header
);
curl_setopt_array($ch, $options);
$return = curl_exec($ch);
curl_close($ch);
return $return;
}
PHP file to get the contents (using echo for testing)
require "functions.php";
require "phpQuery.php";
echo curl_get('http://www.deindeal.ch/deals/hotel-walliserhof-zermatt-2-naechte-30/');
So far I've attempted the following to get this to work
Ran the file both locally (XAMPP) and remotely (LAMP).
Added the user-agent and HTTP headers as recommended here: file_get_contents and CURL can't open a specific website - before that, curl_get() contained all the options it does now, except for CURLOPT_USERAGENT and CURLOPT_HTTPHEADER.
Is it possible for a website to completely block requests via cURL or other remote file opening mechanisms, regardless of how much data is supplied to attempt to make a real browser request?
Also, is it possible to diagnose why my requests are turning up with nothing?
Any help answering the above two questions, or editing/making suggestions to get the file's contents, even if through a method different than cURL would be greatly appreciated ;).
Try adding:
CURLOPT_FOLLOWLOCATION => TRUE
to your options.
If you run a simple curl request from the command line (including a -i to see the response headers) then it is pretty easy to see:
$ curl -i 'http://www.deindeal.ch/deals/hotel-cristal-in-nuernberg-30/'
HTTP/1.1 302 FOUND
Date: Fri, 30 Dec 2011 02:42:54 GMT
Server: Apache/2.2.16 (Debian)
Vary: Accept-Language,Cookie,Accept-Encoding
Content-Language: de
Set-Cookie: csrftoken=d127d2de73fb3bd72e8986daeca86711; Domain=www.deindeal.ch; Max-Age=31449600; Path=/
Set-Cookie: generic_cookie=1; Path=/
Set-Cookie: sessionid=987b1a11224ecd0e009175470cf7317b; expires=Fri, 27-Jan-2012 02:42:54 GMT; Max-Age=2419200; Path=/
Location: http://www.deindeal.ch/welcome/?deal_slug=hotel-cristal-in-nuernberg-30
Content-Length: 0
Connection: close
Content-Type: text/html; charset=utf-8
As you can see, it returns a 302 with a Location header. If you hit that location directly, you will get the content you are looking for.
And to answer your two questions:
No, it is not possible to block requests from something like cURL. If the consumer can talk HTTP then it can get to anything the browser can get to.
Diagnosing with an HTTP proxy could have been helpful for you. Wireshark, fiddler, charles, et al. should help you out in the future. Or, do like I did and make a request from the command line.
EDIT
Ah, I see what you are talking about now. So, when you go to that link for the first time you are redirected and a cookie (or cookies) is set. Once you have those cookies, your request goes through as intended.
So, you need to use a cookiejar, like in this example: http://icfun.blogspot.com/2009/04/php-how-to-use-cookie-jar-with-curl.html
So, you will need to make an initial request, save the cookies, and make your subsequent requests including the cookies after that.
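A minimal cookie-jar sketch, assuming the sub-page URL from the question and a writable cookies.txt next to the script:
$cookie_file = __DIR__ . '/cookies.txt';
$ch = curl_init('http://www.deindeal.ch/deals/hotel-cristal-in-nuernberg-30/');
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_FOLLOWLOCATION => 1,        // follow the 302 to the welcome page
    CURLOPT_COOKIEJAR => $cookie_file,  // write received cookies here when the handle is closed
    CURLOPT_COOKIEFILE => $cookie_file, // replay stored cookies on subsequent requests
));
$html = curl_exec($ch);
curl_close($ch);
echo $html;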