Hey guys,
I need to transfer a large number of files from one server to another as part of an "update/re-install" process for the application I'm building.
So far the files have been pushed by a main server via FTP. This works well, but I want to stop storing clients' FTP credentials and turn the push method into a pull method: the client clicks "Update" and the client server fetches the files.
I've been looking into Phar, Zip and other ways of packing files, but they require extensions, and I want my application to depend on as few extensions as possible.
So I've resorted to transferring the files as JSON. The main/source server packs all the files into a JSON array and sends it to the client server upon request, and the client server loops over the files and saves them. It works perfectly well for PHP, JavaScript, etc., but some images are corrupted in the process.
I suspect this is because the data is transferred as ASCII rather than binary: I ran into the same problem when I built the installation around FTP, and once I switched from ASCII to binary transfer the images were no longer corrupted.
Does anybody here have a solution for getting the images transferred without corruption?
I use file_get_contents, and have used it in other projects to open and save image data, so I know the function can handle it. I suspect the JSON needs some additional encoding or similar to carry the image content across correctly?
Thanks in advance
Try base64. That is the simplest way to transfer binary data with PHP.
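Roughly like this, as a minimal sketch (the endpoint URL and the $files list are placeholders for whatever your update process uses):

<?php
// Source server: pack the files into JSON, base64-encoding each one so
// binary data (images, etc.) survives the text-based transfer intact.
$package = [];
foreach ($files as $path) {                 // $files: list of file paths to ship (assumed)
    $package[$path] = base64_encode(file_get_contents($path));
}
echo json_encode($package);

<?php
// Client server: request the package and write every file back out.
$json    = file_get_contents('https://source.example.com/package.php'); // placeholder URL
$package = json_decode($json, true);
foreach ($package as $path => $data) {
    file_put_contents($path, base64_decode($data));
}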
So I am simply seeking to upload files to the server for later upload to Azure Blob Storage.
I am aware you can POST data using an HTML form; however, this isn't a solution.
I would like to be able to upload files directly using POST such as through cURL or other means. As far as I'm aware PHP also requires the content type header to be specified as multipart/form-data for this.
I have tried using file handles, but it appears I can only interact with locally stored files. I have also played around with move_uploaded_file, but I am unable to POST files directly.
Streaming is important here, as files will likely be 5 MB+, and multiple concurrent uploads would quickly saturate PHP's allocated memory. While I can change PHP settings on the staging server, I have no idea what the memory allocation will be like on the Azure VM production server, and it wouldn't make sense to keep changing PHP settings as load increases.
However, if PHP does not offer such a solution out of the box, I'm open to alternatives.
To summarise, how do I stream data (e.g. videos) to a PHP file?
Thank you in advance.
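A minimal sketch of what the receiving side of such a stream could look like, assuming the client sends the file as the raw body of a POST/PUT request (not multipart/form-data) and the destination path is a placeholder:

<?php
// upload.php - copy the raw request body to disk in small chunks,
// so memory use stays constant no matter how large the file is.
$in  = fopen('php://input', 'rb');              // raw request body as a stream
$out = fopen('/tmp/upload-' . uniqid(), 'wb');  // placeholder destination path

while (!feof($in)) {
    fwrite($out, fread($in, 8192));             // 8 KB at a time
}

fclose($in);
fclose($out);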
I have a website where users can upload MP3 files (Uploadify), stream them using an HTML5 player (jPlayer), and download them using a PHP script (www.zubrag.com/scripts/).
When a user uploads a song, the path to the audio file is saved in the database, and I'm using that data to play the song and show a download link for it.
The problem I'm experiencing is that, according to my host, this method is using a lot of memory on the server, which is a dedicated one.
Link to script: http://pastebin.com/Vus8SRa7
How should I handle the script properly? And what would be the best way to track down the problem? Any ideas on cleaning up the code?
Any help much appreciated.
I would recommend storing your files on disk (named something random [check for collisions!] or sequential, without file extension, and outside of the doc root), and only store information in your DB. It's much easier to stream a file from disk this way than it is out of a database result.
When you retrieve an entire file's contents out of a database result, that data has to be in memory. readfile() doesn't have this issue. Use headers to return the original file name when sending the file back to the client, if you wish.
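A minimal sketch of such a download script, assuming $storedPath and $originalName come from your database row:

<?php
// Send the stored file back under its original name without pulling
// the whole thing into memory first.
header('Content-Type: audio/mpeg');
header('Content-Disposition: attachment; filename="' . $originalName . '"');
header('Content-Length: ' . filesize($storedPath));

readfile($storedPath);   // outputs the file directly instead of buffering it all in memory
exit;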
I would suggest that you not buffer the content when writing the binary data of the MP3 to your HTTP output. That way you'd save a lot of physical and virtual memory.
(I have a Linux server and my computer's operating system is Windows.)
Hi. I've heard that you should transfer text files like PHP files using the ASCII transfer type, and use the binary transfer type for other kinds of files, such as images; otherwise you can get errors.
But I once encountered an error after transferring some PHP files in ASCII mode, so I tried binary transfer instead and it resolved the problem right away.
Being curious, I downloaded some PHP files and uploaded them again using FileZilla (both times using binary transfer), and they caused no problems at all.
Question:
Was I just lucky or is it totally okay to transfer PHP files using binary transfer mode?
Many Thanks in advance.
You can use binary transfer any time; it transfers a byte-for-byte identical copy. You should use ASCII for all text-based files.
You state you "had a problem" uploading a PHP script using ASCII, but I find it hard to believe that was the cause. Can you elaborate?
Folks
I have an image on some server (SOURCE)
i.e. http://stagging-school-images.s3.amazonaws.com/2274928daf974332ed4e69fddc7a342e.jpg
Now I want to upload it to somewhere else (DESTINATION)
i.e. example.mysite.com/receiveImage.php
Currently, I copy the image from the source to my local server and then upload it to the destination.
This works perfectly but takes too much time, since it copies the image first and then uploads it...
I want to make this simpler and more efficient by uploading the image directly from the source URL to the destination URL.
Is there a way to handle this?
I am using PHP/cURL for the current functionality.
Any help would be very much appreciated.
Cheers !!
If example.mysite.com/receiveImage.php is your own service, then you may:
pass the SOURCE URL to your PHP script as a GET or POST parameter
in the PHP script, use the file_get_contents() function to fetch the image by URL and save it to your storage (see the sketch below)
Otherwise it's impossible by means of HTTP.
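A minimal sketch of that receiveImage.php, assuming the source URL arrives as a GET parameter and the storage directory is a placeholder:

<?php
// receiveImage.php - pull the image straight from the source URL and
// save it into local storage, skipping the intermediate server.
$sourceUrl = $_GET['src'];                                    // e.g. the S3 URL, passed by the caller
$fileName  = basename(parse_url($sourceUrl, PHP_URL_PATH));   // keep the original file name
$uploadDir = '/var/www/uploads';                              // placeholder storage directory

file_put_contents($uploadDir . '/' . $fileName, file_get_contents($sourceUrl));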
However, there are some ways to increase the file upload speed a little:
If the files are huge, you may use two threads: one for downloading (which stores all downloaded data in a buffer) and one for uploading (which takes whatever data is available from the buffer and uploads it to the site). As far as I know, this can't be done easily with PHP, because multi-threading is not supported out of the box.
If there are many files, you may use several threads/processes that download and upload simultaneously (see the curl_multi sketch below).
By the way, these measures do not eliminate the double traffic through your intermediate service.
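For the many-files case, one way to get concurrency inside a single PHP process is curl_multi; a rough sketch for the download side (the URLs and file names are placeholders):

<?php
// Download several files concurrently with curl_multi instead of separate processes.
$urls = ['https://example.com/a.jpg', 'https://example.com/b.jpg'];  // placeholder URLs

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $i => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}

do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);                   // wait for activity instead of busy-looping
} while ($running > 0);

foreach ($handles as $i => $ch) {
    file_put_contents("download-$i.bin", curl_multi_getcontent($ch));
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);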
One of the services may have a form somewhere that will allow you to specify a URL to receive from/send to, but there is no generic HTTP mechanism for doing so.
The only way to transfer the image directly from source to destination is to initiate the transfer from either the source or the destination. You can't magically beam the data between these two locations without them talking to each other directly. If you can SSH login to your mysite.com server you could download the image directly from there. You could also write a script that runs on mysite.com and directly downloads the image from the source.
If that's not possible, the best alternative may be to play around with fread/fwrite instead of cURL. That should allow you to read a little bit from the source and immediately upload that bit to the destination, so the download and upload can work in parallel. For huge files this should make a real difference; for small files on a decent connection it probably won't.
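A rough sketch of that parallel read/upload, feeding a source stream straight into cURL so the upload starts while the download is still running; this assumes the receiving script reads the raw request body (php://input) rather than a multipart form:

<?php
// Stream the image from the source URL into the upload as it downloads,
// so nothing is buffered to disk or held fully in memory.
$source = fopen('http://stagging-school-images.s3.amazonaws.com/2274928daf974332ed4e69fddc7a342e.jpg', 'rb');

$ch = curl_init('http://example.mysite.com/receiveImage.php');
curl_setopt($ch, CURLOPT_UPLOAD, true);           // send a request body read from a stream
curl_setopt($ch, CURLOPT_INFILE, $source);        // ...and that stream is the source download
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'POST');  // CURLOPT_UPLOAD defaults to PUT
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);

curl_close($ch);
fclose($source);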
Create two text fields: one for the URL, the other for the filename.
In PHP, use:
copy($url, $uploadDir.'/'.$fileName);
$uploadDir is the path to your file directory ;)
We are creating a website for hotel booking and need to store a large number of images. We think it would be better to store the images in the filesystem and keep the paths in the database, but do we have to save them manually? We are using web services from another website to get the images. Is there a way to save the images to the filesystem dynamically?
You can use PHP's file_get_contents() function or cURL to download all the images you want to disk, or simply reference the foreign images to your clients so you won't need to store them locally on the server.
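A small sketch of the cURL route, streaming each image straight to disk (the directory and naming scheme are placeholders):

<?php
// Download one remote image to the local filesystem and return the path
// you would then store in the database.
function saveRemoteImage(string $url, string $imageDir): string
{
    $localPath = $imageDir . '/' . md5($url) . '.jpg';  // derived name; adjust to your own scheme
    $fh = fopen($localPath, 'wb');

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fh);                // write the response straight into the file
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_exec($ch);
    curl_close($ch);
    fclose($fh);

    return $localPath;
}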
If you like Python, check out the mechanize lib and BeautifulSoup to parse XML if you need to.
Storing on disk vs. storing in the database each has its benefits. Disk storage is easier to scale: you can always have a lighttpd or nginx server dedicated to images, or simply move them to another server to balance bandwidth.
It depends on the database, and how you are serving up the images. In general it is better to save the images to disk, depending on how you are delivering them to the client.
Getting the images is usually a matter of some process on the server downloading them from websites and saving them. On many systems you could use wget or curl to download the images and save them.
It also depends on how you are getting the data. If it is some inline binary via XML or something, then you will need to extract that using the features of your application language, and save it to disk.
The mechanics of how to do that vary wildly depending on the implementation language and the hosting operating system.
I think you should store both the remote (web service) and local (filesystem) locations in the database, with the filesystem location initially blank. When a user requests an image for the first time, download it, update the file field, and show it. With this approach you will only store the images your clients actually need.
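A rough sketch of that lazy-download idea, assuming an images table with remote_url and local_path columns and a PDO connection (all names are placeholders):

<?php
// Return a local path for the image, fetching it from the web service
// the first time it is requested and caching the result on disk.
function getImagePath(PDO $db, int $imageId, string $imageDir): string
{
    $stmt = $db->prepare('SELECT remote_url, local_path FROM images WHERE id = ?');
    $stmt->execute([$imageId]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    if (empty($row['local_path'])) {                    // first request: download and remember it
        $localPath = $imageDir . '/' . $imageId . '.jpg';
        file_put_contents($localPath, file_get_contents($row['remote_url']));

        $db->prepare('UPDATE images SET local_path = ? WHERE id = ?')
           ->execute([$localPath, $imageId]);
        return $localPath;
    }

    return $row['local_path'];
}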