Server-side caching of PHP readfile images

For SEO reasons I am serving some images using PHP readfile() and a URL rewrite. I save images in a folder under a numerical id (id.jpg) but serve them under a keyword name (some-seo-word-id.jpg). The script runs smoothly and shows images effectively, however on pages with multiple images (5 to 6) it fails to send all of them correctly.
Refreshing the page 2 to 3 times sometimes shows all the images, sometimes it does not! Since the images delivered via readfile() are only 70-140 KB each, I highly doubt that memory is the issue. My questions are:
Is there a better approach to this using only Apache rewrite?
How does Facebook deliver all its images effectively with PHP? Their URLs look something like this: https://www.facebook.com/photo.php?fbid=XXXXXXXXX27&set=a.4533609xxxx.xxxx099.xxxx210426&type=1&theater
Will server-side caching in binary format help? To me this should not change much, as readfile() of an image or a cached file still reads a particular file (the cached copy instead of the original image!)
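On the Apache-rewrite question: since the keyword part of the filename exists only for SEO, one option is to let mod_rewrite map the keyword URL straight back to the numeric file, so Apache serves the image statically and PHP never runs. A minimal .htaccess sketch, assuming the files live under /images/ and the id is the trailing number before .jpg:
# Map /images/some-seo-word-123.jpg internally to /images/123.jpg
RewriteEngine On
RewriteRule ^images/.*-([0-9]+)\.jpg$ images/$1.jpg [L]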

Related

Bulk image compression on server via php

I am developing a content-management kind of application in PHP where users can upload their content with images and other assets.
Some users have uploaded larger images, maybe a 1 MB image for the logo of a page.
I want to compress all the images present in the folder containing the images of all users, recursively.
How can that be done using PHP or any server-side scripting?
Note: I don't have shell access to the server to run any script. The only way to do it is with something in PHP.
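One common approach is a sketch along these lines, assuming the GD extension is available, the images are JPEGs, and overwriting in place is acceptable (the $baseDir path and quality value are placeholders):
<?php
// Sketch: recursively re-save JPEGs at a lower quality using GD.
$baseDir = '/path/to/user/images';   // placeholder path
$quality = 75;                       // 0-100, lower = smaller file

$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($baseDir, FilesystemIterator::SKIP_DOTS)
);

foreach ($iterator as $file) {
    if (strtolower($file->getExtension()) !== 'jpg') {
        continue;
    }
    $path = $file->getPathname();
    $img  = @imagecreatefromjpeg($path);
    if ($img === false) {
        continue;                     // skip unreadable or corrupt files
    }
    imagejpeg($img, $path, $quality); // overwrite with the recompressed copy
    imagedestroy($img);
}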

PHP compressing a large file makes the server unable to respond

My site has a lot of music and picture resources.
Users can log in to my site and download those resources.
When a user clicks the 'download' button, PHP compresses the selected files, and the user then downloads the compressed (zip) file.
The problem is:
While PHP is compressing the files, other users visiting my site have to wait for a response until the compression is completed.
So:
How can PHP do the compression while other users can still visit my website quickly?
First, a reply to Marc B and crafter: by compressing files into a zip archive you can send multiple files at once. This option is common on file and image hosting sites; for example, this is how you can download all photos from a Google+ album.
Now, about the problem. The first thing to do is to set the compression level to zero, since compression is not what you really want. By setting compression to zero you'll send (here I simplify a little) the concatenated files with a ZIP header at the beginning. If you use some sort of output buffering, you'd better disable it for this functionality.
Also, if you use Apache you may consider migrating to NGINX + PHP-FPM; it deals much better with such tasks.
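A minimal sketch of that "store, don't compress" idea using PHP's ZipArchive (setCompressionName() requires PHP 7+; the file list and temp handling here are placeholders):
<?php
// Sketch: build a zip that only stores files (no compression), then stream it to the client.
$files = ['/data/music/track1.mp3', '/data/images/photo1.jpg']; // the files the user selected

$tmp = tempnam(sys_get_temp_dir(), 'zip');
$zip = new ZipArchive();
$zip->open($tmp, ZipArchive::CREATE | ZipArchive::OVERWRITE);

foreach ($files as $path) {
    $name = basename($path);
    $zip->addFile($path, $name);
    $zip->setCompressionName($name, ZipArchive::CM_STORE); // level zero: just store the bytes
}
$zip->close();

// Disable output buffering before sending, as suggested above.
while (ob_get_level() > 0) {
    ob_end_clean();
}
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="download.zip"');
header('Content-Length: ' . filesize($tmp));
readfile($tmp);
unlink($tmp);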

Only allow real visitors, and block custom "parsing" bots through PHP?

I have a very large database of all items in a massive online game on my website, and so do my competitors. I, however, am the only site to have pictures of all these items. All these pictures are on my server, e.g. from 1.png to 99999.png (all in the same directory).
It's very easy for my competitors to write a simple file_get_contents/file_put_contents script to just scrape all of these images to their own server and redistribute them their own way. Is there anything I can do about this?
Is there a way to limit (for example) everyone to only see/load 100 images per minute (I'm sure those scripts would rapidly scrape all of the images)? Or even better, only allow real users to visit the URLs? I'm sure those scripts won't obey a robots.txt file, so what would be a better solution? Does anybody have an idea?
Place a watermark in your images that states that the images are copyrighted by you or your company. Your competitors would have to remove the watermark and make the image look like there never was one, so that would definitely be a good measure to take.
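A minimal watermarking sketch with GD (the overlay file, the placement and overwriting in place are all assumptions):
<?php
// Sketch: stamp a PNG watermark onto an item image using GD.
$photo     = imagecreatefrompng('/images/1234.png');
$watermark = imagecreatefrompng('/images/watermark.png'); // your copyright overlay

$wmW = imagesx($watermark);
$wmH = imagesy($watermark);

// Bottom-right corner with a 10px margin.
imagecopy($photo, $watermark,
    imagesx($photo) - $wmW - 10, imagesy($photo) - $wmH - 10,
    0, 0, $wmW, $wmH);

imagepng($photo, '/images/1234.png'); // overwrite, or better, write a separate public copy
imagedestroy($photo);
imagedestroy($watermark);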
If you're using Apache Web Server, create an image folder and upload an htaccess file that tells the server that only you and the server are allowed to see the files. This will help hide the images from the parsing bots, as Apache would see that they are not authorized to see what's in the folder. You'd need to have PHP load the images (not just pass img tags on) so that as far as the permissions system is concerned, the server is accessing the raw files.
On your PHP page itself, use a CAPTCHA device or some other robot detection method.
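And a rough sketch of the rate-limiting idea, serving the images through a PHP script with a crude per-IP counter (the APCu counter, the 100-per-minute threshold and the storage path outside the web root are all assumptions; a database or memcached counter would work just as well):
<?php
// Sketch: image.php?id=1234 serves protected images with a per-IP, per-minute limit.
$ip     = $_SERVER['REMOTE_ADDR'];
$bucket = 'imgcount_' . $ip . '_' . floor(time() / 60); // one counter per IP per minute

apcu_add($bucket, 0, 120);     // create the counter with a short TTL if it does not exist
$count = apcu_inc($bucket);

if ($count > 100) {
    http_response_code(429);   // Too Many Requests
    exit('Slow down.');
}

$id   = (int) ($_GET['id'] ?? 0);
$path = '/var/www/protected_images/' . $id . '.png'; // folder denied to direct web access

if ($id <= 0 || !is_file($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: image/png');
header('Content-Length: ' . filesize($path));
readfile($path);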

Mp3 streaming/downloading website - apache server memory issue

I have a website on which users can upload mp3 files (uploadify), stream them using an HTML5 player (jPlayer) and download them using a PHP script (www.zubrag.com/scripts/).
When a user uploads a song, the path to the audio file is saved in the database, and I'm using that data in order to play the song and show a download link for it.
The problem I'm experiencing is that, according to my host, this method is using a lot of memory on the server, which is a dedicated machine.
Link to script: http://pastebin.com/Vus8SRa7
How should I handle the script properly? And what would be the best way to track down the problem? Any ideas on cleaning up the code?
Any help much appreciated.
I would recommend storing your files on disk (named something random [check for collisions!] or sequential, without file extension, and outside of the doc root), and only store information in your DB. It's much easier to stream a file from disk this way than it is out of a database result.
When you retrieve an entire file's contents out of a database result, that data has to be in memory. readfile() doesn't have this issue. Use headers to return the original file name when sending the file back to the client, if you wish.
I would suggest you not buffer the content when you are writing the binary data of the MP3 to your HTTP output. That way you'll save a lot of physical and virtual memory.
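Putting those two answers together, a download script sketch could look like this (the on-disk path and original name would come from your database lookup; both are placeholders here). The point is that readfile() streams from disk and output buffering is switched off first:
<?php
// Sketch: stream an MP3 from disk without holding the whole file in PHP memory.
$path         = '/var/media/a83f0c91';   // random on-disk name, outside the doc root
$originalName = 'my-song.mp3';           // the name the user originally uploaded

if (!is_file($path)) {
    http_response_code(404);
    exit;
}

// Turn off any output buffering so the file is not copied into memory first.
while (ob_get_level() > 0) {
    ob_end_clean();
}

header('Content-Type: audio/mpeg');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="' . $originalName . '"');

readfile($path); // reads and flushes in chunks; memory use stays flat
exit;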

How can I upload an image from source URL to some destination URL?

Folks
I have an image at some server (SOURCE)
i.e. http://stagging-school-images.s3.amazonaws.com/2274928daf974332ed4e69fddc7a342e.jpg
Now I want to upload it to somewhere else (DESTINATION)
i.e. example.mysite.com/receiveImage.php
First, I copy the image from the source to my own server and then upload it to the destination.
It works perfectly but takes too much time, as it copies the image and only then uploads it...
I want to make this simpler and more optimized by uploading the image directly from the source URL to the destination URL.
Is there a way to handle this ?
I am using php/cURL to handle my current functionality.
Any help would be very much appreciated.
Cheers !!
If example.mysite.com/receiveImage.php is your own service, then you may:
pass the SOURCE URL to your PHP script as a GET or POST parameter
in the PHP script, use the file_get_contents() function to obtain the image by URL, and save it to your storage
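A minimal sketch of such a receiveImage.php (the src parameter name, the URL check and the storage directory are all assumptions):
<?php
// Sketch of receiveImage.php: fetch an image from a source URL passed as a parameter.
$src = $_POST['src'] ?? $_GET['src'] ?? '';

if (!filter_var($src, FILTER_VALIDATE_URL)) {
    http_response_code(400);
    exit('Invalid URL');
}

$data = file_get_contents($src);   // the destination server downloads the image itself
if ($data === false) {
    http_response_code(502);
    exit('Could not fetch the source image');
}

$name = basename(parse_url($src, PHP_URL_PATH));
file_put_contents(__DIR__ . '/uploads/' . $name, $data);

echo 'Stored ' . htmlspecialchars($name);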
Otherwise it's impossible by means of HTTP.
However, there are some ways to increase file upload speed a little:
If the files are huge, you may use two threads: one for downloading (it stores all downloaded data in a buffer) and one for uploading (it takes the available data from the buffer and uploads it to the site). As far as I know, this can't be done easily with PHP, because multi-threading is currently not supported.
If there are many files, you may use many threads/processes, which will download/upload simultaneously.
By the way, these measures do not eliminate double traffic through your intermediate server.
One of the services may have a form somewhere that will allow you to specify a URL to receive from/send to, but there is no generic HTTP mechanism for doing so.
The only way to transfer the image directly from source to destination is to initiate the transfer from either the source or the destination. You can't magically beam the data between these two locations without them talking to each other directly. If you can SSH login to your mysite.com server you could download the image directly from there. You could also write a script that runs on mysite.com and directly downloads the image from the source.
If that's not possible, the best alternative may be to play around with fread/fwrite instead of curl. This should allow you to read a little bit from the source, then directly upload that bit to the destination so download and upload can work in parallel. For huge files this should make a real difference, for small files on a decent connection it probably won't.
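A rough sketch of that streaming idea, using cURL's upload-from-a-stream support rather than raw fread()/fwrite() (it assumes the destination accepts a raw request body via PUT; if it expects a normal form upload this won't apply as-is):
<?php
// Sketch: stream the source image into the upload without buffering the whole file locally.
$sourceUrl      = 'http://stagging-school-images.s3.amazonaws.com/2274928daf974332ed4e69fddc7a342e.jpg';
$destinationUrl = 'http://example.mysite.com/receiveImage.php'; // must accept a raw PUT body

$src = fopen($sourceUrl, 'rb');            // download side: open the source as a stream

$ch = curl_init($destinationUrl);
curl_setopt($ch, CURLOPT_UPLOAD, true);    // upload side: cURL pulls data from the stream
curl_setopt($ch, CURLOPT_INFILE, $src);    // so data flows source -> here -> destination in chunks
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

$response = curl_exec($ch);
curl_close($ch);
fclose($src);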
Create two text fields, one for the URL and one for the file name.
In PHP, use:
copy($url, $uploadDir.'/'.$fileName);
where $uploadDir is the path to your upload directory ;)
