PHP fopen not working on one particular domain

I'm trying to download a file from a remote URL with the fopen() function.
The problem is that the function returns false for the one website I need, while it works fine for other domains.
How can that be? Is there some option in PHP I should set? Or can the website protect the file from this kind of access (the file is available from a browser)?

There are a number of checks a server can do to prevent misuse of its service. One example is checking the HTTP Referer header, which indicates that your request was made by a browser navigating from a link to the object.
You can simulate all of that if you want to, but first you have to find out exactly what the difference is between your request and the one the browser successfully makes. Two things to do for that:
Find out the exact error message you receive back. The easiest way is to use PHP's cURL extension instead of fopen() for your request; it lets you dump everything you get back. The reply might contain valuable information, such as the reason for the refusal.
Monitor both requests with a network sniffer, for example tcpdump or Wireshark. Comparing the two requests tells you the exact difference, and that is the information you need to precisely rebuild the browser's request in your script.
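A minimal sketch of the first step: fetch the URL with cURL and return both the body and the diagnostic info, so a refusal (a 403, an empty body, a cURL error) becomes visible instead of just `false`. The URL in the usage note is a placeholder.

```php
<?php
// Sketch: fetch a remote URL with cURL and keep all diagnostics,
// instead of getting a bare "false" from fopen().
function debugFetch(string $url): array
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,  // return the body instead of printing it
        CURLOPT_FOLLOWLOCATION => true,  // follow redirects like a browser would
    ]);
    $body = curl_exec($ch);             // false on failure
    $info = [
        'status' => curl_getinfo($ch, CURLINFO_HTTP_CODE),
        'error'  => curl_error($ch),    // empty string if no transport error
    ];
    curl_close($ch);
    return ['body' => $body, 'info' => $info];
}

// Usage: inspect the status and error text when the fetch fails.
// $r = debugFetch('http://example.com/file.zip');
// if ($r['body'] === false) { var_dump($r['info']); }
```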

On some shared hosting plans and VPSes, fopen() does not work or is disabled in PHP. Try using cURL to get the content. If that does not work, a last-resort option (only if you need to send information via GET, not receive data) is an <img> tag with the request in its src attribute. That works only for sending information; if you need to receive something back, you need AJAX or cURL.
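The &lt;img&gt; workaround described above can be sketched as follows: emit a 1x1 image tag whose src carries the data in its query string, so the visitor's browser fires the GET for you. The endpoint URL and parameters are hypothetical.

```php
<?php
// Sketch of the <img> beacon trick: the browser makes the GET request;
// the response is never read. Endpoint URL is a placeholder.
function beaconTag(string $endpoint, array $params): string
{
    $url = $endpoint . '?' . http_build_query($params);
    return '<img src="' . htmlspecialchars($url, ENT_QUOTES)
         . '" width="1" height="1" alt="">';
}

// Usage: echo the tag into the page you serve.
echo beaconTag('https://example.com/track.php', ['event' => 'pageview']);
```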

Related

Can servers block curl requests specifically?

Generally speaking, is it possible for a server to block a PHP cURL request?
I've been making cURL requests every 15 minutes to a certain public-facing URL for about 6-8 months. Suddenly the other day it stopped working, and the URL started returning an empty string.
When I hit the URL in a browser or with a Python GET request, it returns the expected data.
I decided to try hitting the same URL with a file_get_contents() function in PHP, and that works as expected as well.
Since I found a bandaid solution for now, is there any difference between the default headers that cURL sends vs file_get_contents() that would allow one request to be blocked and the other to get through?
Generally speaking, is it possible for a server to block a PHP cURL request?
Sort of. The server can block requests if your user agent string looks like it comes from curl. Try using the -A option to set a custom user agent string.
curl -A "Foo/1.1" <url>
Edit: Oops I see you said "from PHP", so just set the CURLOPT_USERAGENT option:
curl_setopt($curl, CURLOPT_USERAGENT, 'Foo/1.1');
A lot of websites block you based on the user agent. The best workaround I can think of is to open the developer console in Chrome and click on the Network tab. Go to the URL of the website you are trying to access and find the request that fetches the data you need. Right-click on that request and copy it as cURL; it will contain all the headers your browser sends.
If you add all of those headers to your cURL request in PHP, the web server will not be able to tell the difference between a request from your script and one from your browser.
You will need to update those headers once every couple years (some websites try to forbid old versions of firefox or chrome which bots have been abusing for years).
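A sketch of that approach: a header set modeled on a browser's "Copy as cURL" output, applied with CURLOPT_HTTPHEADER. The header values here are illustrative; paste your own from DevTools.

```php
<?php
// Sketch: browser-like headers to replay in a cURL request.
// Values are examples only; copy real ones from your browser's DevTools.
function browserHeaders(): array
{
    return [
        'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) '
            . 'AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36',
        'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
        'Accept-Language: en-US,en;q=0.9',
        'Referer: https://example.com/',
    ];
}

// Applying them to a cURL handle (URL is a placeholder):
// $ch = curl_init('https://example.com/data');
// curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// curl_setopt($ch, CURLOPT_HTTPHEADER, browserHeaders());
// $body = curl_exec($ch);
```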
Forget curl. Think about it from the perspective of an HTTP request. All the server sees is that. If your curl request contains something (user agent header for instance) that the server can use to filter out requests, it can use this to reject those requests.

Selenium 2 WebDriver: How to verify that an image request has been received and fulfilled?

Is there a good way to check that a GET request for an image, something like https://api.google.com/v1/__x.gif, has been received & fulfilled in Selenium 2 WebDriver with php?
Initially I thought that I could make an XHR request and alert() the responseText, using assertEquals to compare my expected string to the actual output. Quickly realized this wasn't going to work, since I wanted to see the page's network requests that I'm testing.
After more research, I found two very different possibilities:
First being captureNetworkTraffic (pending response from Sauce Labs support to see if this is possible):
The second option (which I don't completely understand) would be setting up a proxy server.
I'm new to stackoverflow and a beginner when it comes to server requests. Thank you for the help in advance!
Option 1 and 2 are the same; you use a proxy to capture network traffic.
When creating a driver instance in Webdriver, you have the ability to set a proxy. This is a server and port through which the browser will direct all network traffic. Proxies can do many things such as creating mock responses, manipulating requests etc, but in your case, you want the proxy to record the request made, forward on to the required server, record the response, and return response back to browser.
If you use a proxy like Browsermob, you can interrogate the requests during the test run, as the proxy has an API (e.g. get me the latest request the browser made, and assert that it was a POST).
There appears to be a PHP library to wrap interaction with the Browsermob instance https://packagist.org/packages/chartjes/php-browsermob-proxy
So, in your test;
Start proxy
Create driver using the proxy setting
Go to required page
Assert that request was made in Browsermob
Of course, the other, simpler approach could be to get the image src URL from the HTML via Selenium, then make a GET request in the test using an HTTP client. If it returns an image, you can say that the URL from the img tag works, and that may be good enough for your testing.
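The simpler approach above can be sketched like this. The header-parsing helper is pure PHP; the commented usage lines assume a php-webdriver `$driver` instance and rely on `$http_response_header`, which PHP populates after an HTTP `file_get_contents()` call.

```php
<?php
// Sketch: decide whether a set of response headers describes an image.
// Pure function, so it can be tested without any network access.
function isImageResponse(array $responseHeaders): bool
{
    foreach ($responseHeaders as $header) {
        if (stripos($header, 'Content-Type:') === 0
            && stripos($header, 'image/') !== false) {
            return true;
        }
    }
    return false;
}

// Usage in a test (assumes a running php-webdriver session; hypothetical):
// $src = $driver->findElement(WebDriverBy::cssSelector('img'))->getAttribute('src');
// @file_get_contents($src);                       // fires the GET request
// assert(isImageResponse($http_response_header)); // headers from that call
```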

PHP readfile/fgets/fread causes multiple requests to server

I'm trying to stream MP4 files through Apache / Nginx using a PHP proxy for authentication. I've implemented byte-ranges to stream for iOS as outlined here: http://mobiforge.com/developing/story/content-delivery-mobile-devices. This works perfectly fine in Chrome and Safari but.... the really odd thing is that if I monitor the server requests to the php page, three of them occur per page load in a browser. Here's a screen shot of Chrome's inspector (going directly to the PHP proxy page):
As you can see, the first one gets canceled, the second remains pending, and the third works. Again, the file plays in the browser. I've tried alternate methods of reading the file (readfile, fgets, fread, etc) with the same results. What is causing these three requests and how can I get a single working request?
The first request is for the first range of bytes, preloading the file. The browser cancels the request once it has downloaded the specified amount.
The second one I'm not sure about...
The third one is when you actually start playing the media file, it requests and downloads the full thing.
Not sure whether this answers your question, but serving large binary files with PHP isn't the right thing to do.
It's better to let PHP handle authentication only and pass the file reference to the web server to serve, freeing up resources.
See also: Caching HTTP responses when they are dynamically created by PHP
It describes in more detail what I would recommend to do.
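A minimal sketch of that handoff, assuming Nginx's X-Accel-Redirect mechanism (Apache with mod_xsendfile uses an X-Sendfile header instead). The internal path and the auth check are placeholders; building the header list as a pure function keeps it testable.

```php
<?php
// Sketch: PHP does only the auth check, then tells the web server to
// stream the file itself. Paths and auth logic are placeholders.
function protectedFileHeaders(bool $authenticated, string $internalPath): array
{
    if (!$authenticated) {
        return ['HTTP/1.1 403 Forbidden'];
    }
    return [
        'Content-Type: video/mp4',
        // Nginx: $internalPath must map to a location marked "internal"
        'X-Accel-Redirect: ' . $internalPath,
        // Apache/mod_xsendfile equivalent:
        // 'X-Sendfile: /var/media/video.mp4',
    ];
}

// Usage in the proxy script:
// foreach (protectedFileHeaders(isset($_SESSION['user']), '/protected/video.mp4') as $h) {
//     header($h);
// }
```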

PHP redirect file post stream

I'm writing an API in PHP that wraps a website's functionality and returns everything as JSON or XML. I've been using cURL, and so far it's working great.
The website has a standard file upload Post that accepts file(s) up to 1GB.
So the problem is how to redirect the file upload stream to the correspondent website?
I could download the file and upload it afterwards, but my server limits me to 20 MB, and it seems like a poor solution anyway.
Is it even possible to control the stream and redirect it directly to the website?
I preserved the original answer at the bottom for posterity but, as it turns out, there is a way to do this.
What you need is a combination of the HTTP PUT method (which unfortunately isn't available in native browser forms), the PHP php://input wrapper, and a streaming PHP socket. This gets around several limitations: PHP disallows php://input for POST form data, but it does nothing of the sort for PUT file data. Clever!
If you're going to attempt this with Apache, you're going to need mod_actions installed and activated. You're also going to need to specify a PUT script with the Script directive in your virtualhost/.htaccess.
http://blog.haraldkraft.de/2009/08/invalid-command-script-in-apache-configuration/
This allows PUT requests for only one URL endpoint. That endpoint is the file that will open the socket and forward its data elsewhere. In my example case below, this is just index.php.
I've prepared a boilerplate example of this using the Python requests module as the client sending the PUT request with an image file. If you run remote_server.py, it opens a service that just listens on a port and awaits the forwarded message from PHP. put.py sends the actual PUT request to PHP. You will need to set the hosts in put.py and index.php to the ones you define in your virtual host, depending on your setup.
Running put.py will open the included image file, send it to your php virtual host, which will, in turn, open a socket and stream the received data to the python pseudo-service and print it to stdout in the terminal. Streaming PHP forwarder!
There's nothing stopping you from using any remote service that listens on a TCP port in the same way, in another language entirely. The client could be rewritten the same way, so long as it can send a PUT request.
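The forwarding step described above can be sketched as a chunked stream copy. In the real index.php, `$in` would be `fopen('php://input', 'rb')` and `$out` a socket from `stream_socket_client()`; the host, port, and chunk size here are assumptions.

```php
<?php
// Sketch of the forwarder: copy one stream to another in fixed-size
// chunks, so the PUT body never has to fit in memory all at once.
function forwardStream($in, $out, int $chunkSize = 8192): int
{
    $bytes = 0;
    while (!feof($in)) {
        $chunk = fread($in, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;                      // end of input or read error
        }
        $bytes += fwrite($out, $chunk); // push the chunk straight on
    }
    return $bytes;                      // total bytes forwarded
}

// Usage inside the PUT endpoint (host/port are placeholders):
// $in  = fopen('php://input', 'rb');
// $out = stream_socket_client('tcp://127.0.0.1:9999', $errno, $errstr, 5);
// forwardStream($in, $out);
```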
The complete example is here:
https://github.com/DeaconDesperado/php-streamer
I actually had a lot of fun with this problem. Please let me know how it works and we can patch it together.
Begin original answer
There is no native way in PHP to pass a file through asynchronously as it comes in with the request body without saving its state to disk in some manner. This means you are hard bound by your server's memory limit (20 MB). The way the $_FILES superglobal is initialized after the request is received depends on this, as PHP will attempt to migrate that multipart data to a tmp directory.
Something similar can be achieved with sockets, as this circumvents the HTTP protocol at least, but if the file is passed in the HTTP request, PHP is still going to attempt to save it statefully in memory before it does anything with it. You'd have the tail end of the process set up with no practical way of getting that far.
The Stream library comes close, but it still relies on reading the file out of memory on the server side; the file has to already be there.
What you are describing is a little bit outside the HTTP protocol, especially since the request body is so large. HTTP is a request/response mechanism, and one depends upon the other; it's very difficult to accomplish an in-place, streaming upload at an intermediary point, since this would imply some protocol that uploads while the bits are streamed in.
One might argue this is more a limitation of HTTP than of PHP, and since PHP is designed expressly with the HTTP protocol in mind, you are moving outside its comfort zone.
Deployments like this are regularly attempted with high success using other scripting languages (Twisted in Python, for example; a lot of people are getting on board with NodeJS for its concurrent design pattern; and there are alternatives in Ruby and Java that I know much less about).

How do I do a GET request using PHP?

I know that this is a simple question for PHP guys but I don't know the language and just need to do a simple "get" from another web page when my page is hit. i.e. signal the other page that this page has been hit.
EDIT: curl is not available to me.
If URL wrappers are enabled (they are by default; see allow_url_fopen), you can use:
file_get_contents('http://www.example.org');
Note that this happens synchronously, so your page won't complete before the request has. It would be better to log the access to a logfile (or database) and export the data occasionally. Alternatively, you could make the request after your page has completed and output has been sent to the client.
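Since the asker says cURL is unavailable, a fire-and-forget sketch with a raw socket: build a minimal HTTP/1.1 GET, write it with fsockopen() on a short timeout, and close without reading the response. The host and path are placeholders.

```php
<?php
// Sketch: a minimal HTTP/1.1 GET request, built by hand so it can be
// fired over a raw socket without cURL. Host and path are placeholders.
function buildPingRequest(string $host, string $path): string
{
    return "GET {$path} HTTP/1.1\r\n"
         . "Host: {$host}\r\n"
         . "Connection: close\r\n"   // tell the server not to keep the socket open
         . "\r\n";                   // blank line ends the request headers
}

// Usage (fire and forget; 1-second timeout so the page isn't held up):
// $fp = @fsockopen('www.example.org', 80, $errno, $errstr, 1.0);
// if ($fp) {
//     fwrite($fp, buildPingRequest('www.example.org', '/hit'));
//     fclose($fp); // don't wait for the response
// }
```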
Beware file_get_contents() and fopen():
If PHP has decided that filename specifies a registered protocol, and that protocol is registered as a network URL, PHP will check to make sure that allow_url_fopen is enabled. If it is switched off, PHP will emit a warning and the fopen call will fail.
There are numerous ways... the simplest is file_get_contents('http://...');
Other functions like fopen() also support http: streams.
If you need more control, I'd suggest looking at curl (see curl_init)
There are a number of ways to send a GET request with PHP. As mentioned above, you can use file_get_contents, fopen or cURL.
You can also use the HTTP extension, the fsockopen() function or streams via fopen.
I'd advise checking out what WordPress has done, as it handles almost all possibilities.
You'll probably want to use cURL
