I have many XML files that I download using file() or file_get_contents(), but the server administrator told me that downloading them with GZIP compression is more efficient. My question is: how can I enable GZIP? I have never done this before, so this solution is really new to me.
You shouldn't need to do any decoding yourself if you use cURL. Just use the basic cURL example code, with the CURLOPT_ENCODING option set to "", and it will automatically request the file using gzip encoding, if the server supports it, and decode it.
Or, if you want the content in a string instead of in a file:
$ch = curl_init("http://www.example.com/");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_ENCODING, ""); // accept any supported encoding
$content = curl_exec($ch);
curl_close($ch);
I've tested this, and it indeed downloads the content in gzipped format and decodes it automatically.
(Also, you should probably include some error handling.)
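For example, a minimal error-handling sketch building on the snippet above (the URL is a placeholder):
$ch = curl_init("http://www.example.com/");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_ENCODING, ""); // accept any supported encoding
$content = curl_exec($ch);
if ($content === false) {
    // curl_error() describes what went wrong (DNS failure, timeout, ...)
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);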
I don't understand your question.
You say that you downloaded these files - you can't unilaterally enable compression client-side.
OTOH, you can control it server-side. Since you've tagged the question as PHP, and it wouldn't make sense for your administrator to recommend compression on a server you don't control, I assume this is what you are talking about.
In which case you'd simply do something like:
<?php
ob_start("ob_gzhandler");
// ...your code for generating the XML goes here
...or maybe this has nothing to do with PHP and the XML files are static, in which case you'd need to configure your webserver to compress them on the fly.
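For static files you could also serve them through a small PHP wrapper - a minimal sketch, where data.xml stands in for one of your files:
<?php
ob_start('ob_gzhandler');                // compresses only if the client sent Accept-Encoding: gzip
header('Content-Type: application/xml');
readfile('data.xml');                    // hypothetical static XML file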
Unless you mean that compression is available on the server and you are fetching the data over HTTP using PHP as the client - in which case the server will only compress the data if the client sends an "Accept-Encoding" request header that includes "gzip". In that case, instead of file_get_contents() you might use:
function gzip_get_contents($url)
{
    $ch = curl_init($url);
    // cURL sends Accept-Encoding: gzip and decompresses the response itself
    curl_setopt($ch, CURLOPT_ENCODING, 'gzip');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $content = curl_exec($ch);
    curl_close($ch);
    return $content;
}
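A usage example (the URL is a placeholder). Because CURLOPT_ENCODING is set, cURL both sends the Accept-Encoding header and decompresses the response, so the returned string is plain XML:
$xml = gzip_get_contents('http://www.example.com/data.xml');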
cURL can probably get a gzipped file:
http://www.php.net/curl
Try using it instead of file_get_contents().
Edit: tested:
curl_setopt($c, CURLOPT_ENCODING, 'gzip');
then:
gzdecode($responseContent);
(Note that when CURLOPT_ENCODING is set, cURL decompresses the response itself, so gzdecode() is only needed if you request gzip manually with a raw Accept-Encoding header.)
Send an Accept-Encoding: gzip header in your HTTP request and then uncompress the result as shown here:
http://de2.php.net/gzdecode
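A minimal sketch of that manual approach, assuming the server actually responds with gzip (the URL is a placeholder):
$ch = curl_init('http://www.example.com/data.xml');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// request gzip ourselves instead of letting cURL negotiate and decode
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Accept-Encoding: gzip'));
$raw = curl_exec($ch);
curl_close($ch);
$xml = gzdecode($raw); // manual decompression, since cURL was not asked to decode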
I am accessing a remote .mp4 video using cURL. I can display it in the browser, but the player functions do not work correctly: the total duration of the video does not appear, and I cannot advance or rewind the video.
What is responsible for these functions? The headers? How can I make them work through cURL?
My current script is this:
<?php
header('Content-Type: video/mp4');
header('Content-Disposition: filename="video.mp4"');
$url = 'http://www.exemple.com/video.mp4';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, false);
curl_exec($ch);
curl_close($ch);
?>
You need to add the Content-Length header, so the client knows how much data remains.
Also make sure your script acts as a stream, rather than reading the whole file and printing it back; google for PHP output buffering. I don't think that causes your issue, but it does improve performance and server health.
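A minimal sketch of that idea, assuming the remote server reports Content-Length (using the URL from your script):
<?php
$url = 'http://www.exemple.com/video.mp4';

// First, a HEAD-style request to learn the file size.
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_NOBODY, true);          // headers only, no body
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
$length = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
curl_close($ch);

header('Content-Type: video/mp4');
header('Content-Disposition: filename="video.mp4"');
if ($length > 0) {
    header('Content-Length: ' . $length);        // lets the player show the duration
}

// Then stream the body straight through to the client.
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, false);
curl_exec($ch);
curl_close($ch);
(Seeking may additionally require support for HTTP Range requests, which this sketch does not implement.)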
I've created a simple PHP file which reads a local JSON file and uses the data to respond to some queries sent to the page.
I'm using file_get_contents() to read the file, but as the request count has grown, I've faced some performance issues because of concurrent connections to the file.
Is it possible to use require() or include() to read and parse the JSON file?
I've found a solution that fits. I'm now using cURL to get the file contents like this:
$curlSession = curl_init();
curl_setopt($curlSession, CURLOPT_URL, 'http://mysite/file.json');
curl_setopt($curlSession, CURLOPT_BINARYTRANSFER, true);
curl_setopt($curlSession, CURLOPT_RETURNTRANSFER, true);
$jsonDataString = curl_exec($curlSession);
curl_close($curlSession);
Yes, you can use require()
ob_start();                                   // capture anything the required file outputs
ob_implicit_flush(false);
require('test.json');                         // the raw JSON is emitted into the buffer
$json = json_decode(ob_get_clean(), true);    // pull it back out and decode
but I don't think it's faster than file_get_contents().
The best option for performance is to have the browser load the JSON file directly (e.g. via jQuery), without any PHP parsing.
I wrote the function below:
function download_xfs($url)
{
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_COOKIE, 'login=michael; xfss=08ruiweu4tuhb5xqs8');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
$string = curl_exec ($ch);
curl_close ($ch);
// Set headers
}
When I use download_xfs("http://address.com/file.html"); it returns the raw binary of the file instead of the actual file. Can anyone rewrite this code to serve the file as a download?
Your error is not in downloading the file; the PNG is probably downloaded OK, but you are probably just echoing it to the screen. You must set the proper HTTP header for this file's MIME type:
header('Content-type: image/png');
But remember not to echo any other strings/characters before or after the image contents (absolutely nothing, not even the invisible BOM that Windows Notepad puts at the beginning of a UTF-8 encoded PHP file). If you want your HTML page to include an image downloaded by your server, you must create another .php file with this header and the image contents, and include it with:
<img src="image_from_download.php">
If you expect plain text but receive unexpected binary data:
The remote server may be doing something you can't see, sending different content or redirecting you to an image depending on the user agent.
Try setting the user agent in cURL to the same value as your regular browser.
Also, make sure the content served by the remote server is not gzipped or chunked (I'm not sure whether cURL unzips it automatically) by sending a proper Accept-Encoding header in the HTTP request.
This problem can also be state-dependent; maybe you need certain cookies set, or need to be logged in to the remote web application you are retrieving the .html from.
Try turning off FOLLOWLOCATION; maybe you received a 30x redirect to an image.
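A sketch combining these suggestions (the user-agent string is a placeholder; the cookie values come from your function):
$ch = curl_init('http://address.com/file.html');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)'); // mimic a real browser
curl_setopt($ch, CURLOPT_ENCODING, '');                  // let cURL decode gzip/deflate itself
curl_setopt($ch, CURLOPT_COOKIE, 'login=michael; xfss=08ruiweu4tuhb5xqs8');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false);         // inspect redirects yourself
$body = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);         // a 30x here means you were being redirected
curl_close($ch);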
If I know the URL of an FLV video,
for example:
www.domain.com/video.flv
I can't use filesize() because it isn't able to check the size of a file at an external URL, right?
So how can I find out its size?
You can use CURL to get the headers of a remote file, including the size of a file.
$url = 'http://www.domain.com/video.flv';
$ch = curl_init();
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$headers = curl_exec($ch);
$size = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
curl_close($ch);
Setting CURLOPT_NOBODY to true makes cURL skip downloading the body of the requested URL.
You could use cURL to send an HTTP HEAD request to the URL whose file size you want to know. The server should send back a Content-Length header in its response, giving the file size in octets (bytes).
The server isn't necessarily going to send the Content-Length header, though. If it doesn't, then your only option is to actually download the file in full.
I can't use filesize() because it isn't able to check the size of a file at an external URL, right?
Yes, it can do that, but it's a bit ridiculous to do so, because the URL wrappers will first download the whole file and then check its size.
You could do a HEAD request instead, which asks the server for the file size without requesting the file data.
Relevant snippet here:
http://snippets.dzone.com/posts/show/1207
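Alternatively, a minimal sketch using PHP's stream wrappers rather than cURL, assuming the server answers HEAD requests and sends Content-Length:
// make the http:// wrapper issue HEAD instead of GET
stream_context_set_default(array('http' => array('method' => 'HEAD')));
$headers = get_headers('http://www.domain.com/video.flv', 1);
// header names keep the server's capitalisation, so check the exact key
$size = isset($headers['Content-Length']) ? (int) $headers['Content-Length'] : null;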
I have been researching this for a while and have not been able to find an answer.
I have a Client Site making calls to our API Server. What I would like is to transfer an image to the Client Site when a special call is made.
I have some code that downloads the image from the server, but this approach forces us to make multiple calls and to create images on the server that we don't want to keep, even if we delete them afterwards.
$originalFileUrl = createImage('createImage', $fileName);
downloadImage($originalFileUrl, $fileDestination);
deleteFileFromServer('deleteImage', $fileName);
function serverCall($action, $fileName) {
    $serverCall = $SERVER . '/api.php?fileName=' . $fileName . '&action=' . $action;
    ob_start();
    $ch = curl_init();
    $timeout = 5;
    curl_setopt($ch, CURLOPT_URL, $serverCall);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    curl_exec($ch);                     // with RETURNTRANSFER off, the response is echoed...
    $fileContents = ob_get_contents();  // ...and captured here from the output buffer
    curl_close($ch);
    ob_end_clean();
    return $fileContents;
}
function downloadImage ($originalFileUrl, $fileDestination) {
// Starting output buffering
ob_start();
// create a new CURL resource
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, $originalFileUrl);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// set timeouts
set_time_limit(30); // set time in seconds for PHP
curl_setopt($ch, CURLOPT_TIMEOUT, 30); // and also for CURL
// open a stream for writing
$outFile = fopen($fileDestination, 'wb');
curl_setopt($ch, CURLOPT_FILE, $outFile);
// grab file from URL
curl_exec($ch);
fclose($outFile);
// close CURL resource, and free up system resources
curl_close($ch);
ob_end_clean();
}
Where $originalFileUrl is the current location of the file, and $fileDestination is the path to where I want my new file to be.
My question is: can I make a call to a PHP file on the Server that will be in charge of creating, transferring, and deleting the image, all in one call, rather than making multiple calls?
Also, for multiple reasons, FTPing the file from the server to the client is not a good option.
Thank you
This will not be a trivial task, but you should be able to design a successful approach. It won't be the most error-safe method of accomplishing the task, though. You're thinking right now of an HTTP-esque stateless protocol, which is manageable. If the description below doesn't sound good enough, consider another protocol that can maintain a constant bi-directional connection (like an SSH tunnel).
You'd likely suffer some data overhead, but that would generally be more than acceptable in order to save multiple calls. To that end, I'd advise creating an XML interface. On the receiving end, your XML would have an element with either a Base64 representation of the image or possibly a gzipped CDATA implementation. You don't have to stick to any XML standard, but if you do, the PHP XML Parser could help with some of the legwork.
So, to recap: in this model, the server end receives a command set that does what you've called out - move the file into a processing folder, create a Base64 string of the file contents, craft the XML package, and return it. The client sends a request and processes the response. If the client detects an error, it can retry, and the server can still grab the file data from the processing queue.
If errors become an issue and an open socket isn't a good option (because the coding is difficult), you could also develop a delete-batching system, where you track the files in the processing folder and only delete them on request. But you'd only make delete requests from the client every once in a while - possibly not as part of any particular page with a user experience, but from a cron job.
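As a rough sketch of that model (api.php, the create_image_on_disk() helper, and the XML shape are all hypothetical placeholders, not your actual API), the server side might look like:
<?php
// hypothetical one-call endpoint: create, transfer inline, then delete
$fileName = basename($_GET['fileName']);   // crude sanitisation for the sketch
$path = '/tmp/processing/' . $fileName;
create_image_on_disk($path);               // hypothetical helper that writes the image

header('Content-Type: application/xml');
echo '<image name="' . htmlspecialchars($fileName) . '">'
   . base64_encode(file_get_contents($path))
   . '</image>';
unlink($path);                             // the file is gone once it has been sent
And the client restores the image in a single request, using the variables from your snippet:
// one request creates, transfers and deletes the image
$ch = curl_init($SERVER . '/api.php?action=createImage&fileName=' . urlencode($fileName));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);

$xml = simplexml_load_string($response);
file_put_contents($fileDestination, base64_decode((string) $xml)); // image restored locally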