Options for client-side zip file extraction before upload - PHP

Our problem
We are building a browser-based application that allows a user to upload files to a server for processing. The server side is written in PHP using the CodeIgniter v2.0.2 framework. The files to be uploaded are the output of another system and their format is out of our control. They can be very large (hundreds of MB), but much of their content is not needed for the server-side processing. The files are actually zip archives (albeit with a bespoke extension) containing a lot of image files together with a relatively small XML file, and it is only this XML we need for the server-side processing. Obviously it would be pretty wasteful to send the entire file when we need less than 1% of its mass. We also don't want to ask users to manually extract the XML from the file.
My question
What are our options for writing client-side code that can extract the XML file from the zip and send it? We are happy to consider any technology that runs in most modern browsers. While we are a C#/C++ coding house, web technologies are not our day-to-day, so code examples are gratefully received!
Many thanks.

Are you looking for a library for zip compression? You could use SLSharpZipLib (the Silverlight port of SharpZipLib) on the client side and its .NET counterpart, SharpZipLib, on the server side.

This should be easily doable with a signed Java applet; roughly 90% of users will have Java installed, and if not, the client can install it.
A signed Java applet will be able to access the file, extract the part you need, and transfer it to your server.
This will also be quite fast, so even large files will work just fine.

I used this lib by Phil Sturgeon. My files were under 20 MB, so I can't tell how it will work for your project.
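For reference, if a whole archive does reach the server anyway, pulling just one entry out of it takes only a few lines with PHP's built-in ZipArchive class. A minimal sketch (the archive path and entry name are assumptions):

    <?php
    // Minimal sketch: read a single entry out of an uploaded archive.
    // '/tmp/upload.dat' and 'metadata.xml' are placeholder names.
    $zip = new ZipArchive();
    if ($zip->open('/tmp/upload.dat') === TRUE) {
        // getFromName() returns the entry's contents as a string, or FALSE.
        $xml = $zip->getFromName('metadata.xml');
        $zip->close();
        if ($xml !== FALSE) {
            file_put_contents('/tmp/metadata.xml', $xml);
        }
    }

That only saves server CPU, not bandwidth; avoiding the transfer of the images still requires the extraction to happen on the client, as the question asks.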

Related

Get Server to Accept Multipart Uploads for Large Files with Abort/Resume Functionality like S3

Amazon S3 has a very nice feature that allows larger files to be uploaded in parts. This would be very useful to me if I were using S3, but I am not.
Here's my problem: I am going to have Android phones uploading reasonably large files (~50 MB of binary data each) on a semi-regular basis. Because these phones use the mobile network to do this, and coverage is spotty in some of the places where they're used, a strong signal cannot be guaranteed. Therefore, doing a simple PUT with 40 MB of data in the content body will not work very well. I need to split up the data somehow (probably into 10 MB chunks) and upload the chunks whenever the signal allows. Once all of the chunks have been uploaded, they need to be merged into a single file.
I have a basic understanding of how the client needs to behave to support this from reading Amazon's S3 client APIs, but I have no idea what the server is doing to allow it. I'm willing to write the server in Python or PHP. Are there any libraries in either language for this sort of thing? I couldn't find anything after about an hour of searching.
Basically, I'm looking for anything that can help point me in the right direction. Information on this and what protocols and headers to use to make this as RESTful as possible would be fantastic. Thanks!
From the REST API documentation for multipart upload, it seems that Amazon expects the client to break the large file into multiple smaller parts and upload them individually. Prior to uploading, you obtain an upload id, and with every upload you include the upload id and a part number for the portion of the file being uploaded.
You may have to structure yours the same way: create a client that can split a huge file into multiple parts and upload them in parallel using the convention above.
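For illustration, here is a minimal PHP sketch of the receiving side under that convention; the parameter names (upload_id, part_number, complete) and the paths are invented for this example, not an existing API:

    <?php
    // Minimal sketch of a chunked-upload receiver. The 'upload_id',
    // 'part_number' and 'complete' parameters are invented placeholders.
    $dir = '/var/uploads/' . basename($_GET['upload_id']);
    if (!is_dir($dir)) {
        mkdir($dir, 0700, true);
    }

    if (isset($_GET['complete'])) {
        // Merge request: concatenate the stored parts in numeric order.
        $out = fopen($dir . '/merged.bin', 'wb');
        $parts = glob($dir . '/part_*');
        natsort($parts); // natural sort so part_10 comes after part_2
        foreach ($parts as $part) {
            fwrite($out, file_get_contents($part));
        }
        fclose($out);
    } else {
        // Part upload: write the raw request body to its own part file.
        $n = (int) $_GET['part_number'];
        file_put_contents($dir . '/part_' . $n, file_get_contents('php://input'));
    }

The client PUTs each part with its part number, retries failed parts when the signal returns, and calls the complete endpoint once every part is acknowledged.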

Quick deploy standalone PHP script to accept uploaded files?

I need to accept a large number of images from a 3rd party, and I already have an Apache server up and running. As the 3rd party is not tech-savvy, I would like to give them a simple web form for uploading files.
They don't need to be able to access the files they've uploaded, although I suppose it would be nice for them to verify what they've already sent, especially given the large number of files.
There is also no requirement to upload all the files at once, and I think I can talk them through packaging the files into 4-5 zip files, so single-file uploads would be acceptable.
If I need to write a PHP script myself then so be it, but I was wondering if such a standalone script already exists in the wild, nice and polished etc :)
Thanks!
A nice Ajax file manager:
http://www.ajaxplorer.info/wordpress/demo/
Others:
http://devsnippets.com/article/7-free-powerful-file-managers.html
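If none of those fit, a hand-rolled script is short. A minimal sketch of a standalone upload page (the field name userfile and the target directory are placeholders, and it does no authentication or size checking):

    <?php
    // Minimal standalone upload page: renders a form and stores the file.
    // 'userfile' and '/var/www/uploads/' are placeholder names.
    if (!empty($_FILES['userfile'])) {
        $name = basename($_FILES['userfile']['name']); // strip any path part
        move_uploaded_file($_FILES['userfile']['tmp_name'],
                           '/var/www/uploads/' . $name);
        echo 'Stored ' . htmlspecialchars($name);
    }
    ?>
    <form method="post" enctype="multipart/form-data">
        <input type="file" name="userfile">
        <input type="submit" value="Upload">
    </form>

Remember to raise upload_max_filesize and post_max_size in php.ini if the zip files are large.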

Multiple file web uploader that uploads to remote FTP?

After tearing my hair out for the last week, I am looking for some sort of web uploader that allows my customers to upload a bunch of files (often up to 200) and have them stored on a remote FTP server. I am looking for something similar to uploadify, swfupload and such, but it is absolutely critical that files uploaded via my web page (at my hosting company) can be stored on my local FTP server.
If this is somehow impossible to do, it could also just upload the files to my website via HTML (which uploadify etc. does) and, after completion, copy the files from the web server to my local FTP.
The closest thing I found was something called FileChunker, and it looked like the perfect solution, BUT it won't let me add multiple files, just one by one.
All help would be greatly appreciated!
Unfortunately I can't give you a concrete answer, but let me say that it should be theoretically possible for a Flash or Java application, since they can use raw TCP sockets and implement the FTP protocol (though I am not aware of any Flash-based implementation).
If I'm not mistaken, all major browsers offer native file upload via FTP by browsing to the FTP directory itself (though you can't influence the visual appearance), just as Windows Explorer can access FTP servers and use them like a network drive.
However, I discourage you from using an FTP server at all. That protocol, with its two connections and its passive/active modes, often causes problems. It's usually much better to upload via HTTP and implement an HTTP-based file server yourself, which is rather easy after all (but be very careful not to expose too much of your server's file system).
I see no real reason to use FTP unless you really want to allow your users to use their FTP client of choice, but that is contrary to your question.
Hope this helps.
Update: I just noticed the sentence "copy the files from the web server to my local ftp". In case you really are talking about two different servers, I would still suggest an HTTP upload and then forwarding the file to the FTP server from the PHP script (your web server acting as a proxy).
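A minimal sketch of that proxy idea using PHP's built-in FTP functions; the host, credentials, and form field name are placeholders:

    <?php
    // Minimal sketch: receive an HTTP upload, then forward it to the FTP
    // server, so the FTP credentials never leave the web server.
    // 'userfile' and the connection details are placeholders.
    if (!empty($_FILES['userfile'])) {
        $tmp  = $_FILES['userfile']['tmp_name'];
        $name = basename($_FILES['userfile']['name']);

        $ftp = ftp_connect('ftp.example.com');
        ftp_login($ftp, 'username', 'password');
        ftp_pasv($ftp, true); // passive mode is friendlier to firewalls
        ftp_put($ftp, $name, $tmp, FTP_BINARY);
        ftp_close($ftp);
    }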
I don't think it's feasible to upload directly from the browser to your FTP server, as you would have to have your credentials more or less visible on the website (e.g. in your JavaScript source).
I once created something similar, but because of that issue I decided to upload via plupload to Amazon S3 and sync the files afterwards via s3sync. The advantages were:
Large file sizes (2 GB+)
One-time tokens for upload, so no need to send credentials to the client
No traffic to your web server (the communication runs client → S3)
Take a look at this thread for an implementation: http://www.plupload.com/punbb/viewtopic.php?id=133
After a wild search I finally found something I could use. This Java applet lets me upload endless numbers of files, zips them down, and I managed to pass a PHP variable into the applet so the zip file is stored with the user's e-mail address as the filename. It cost me $29, but it was well worth it, since I now have full control of where the files go and who uploaded them.

Upload 1GB files using chunking in PHP

I have a web application that accepts file uploads of up to 4 MB. The server-side script is PHP and the web server is NGINX. Many users have requested that this limit be increased drastically to allow uploads of video etc.
However, there seems to be no easy solution to this problem with PHP. First, on the client side I am looking for something that can chunk files during transfer. SWFUpload does not seem to do that. I guess I could stream uploads using JavaFX (http://blogs.oracle.com/rakeshmenonp/entry/javafx_upload_file) but I cannot find any equivalent of request.getInputStream in PHP.
Increasing the web server's post limits or the php.ini upload and max_execution_time settings is not really a solution for really large files (~1 GB), because the browser may time out, and think of all those blobs held in memory.
Is there any way to solve this problem using PHP on the server side? I would appreciate your replies.
plupload is a JavaScript/PHP library; it's quite easy to use and allows chunking.
It relies on HTML5, though.
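plupload posts each chunk as an ordinary multipart upload accompanied by chunk and chunks counters, so the receiving script just appends the pieces in order. A minimal sketch (the target directory is a placeholder; the official plupload examples ship a more complete upload.php):

    <?php
    // Minimal sketch of a plupload chunk receiver. plupload sends each
    // chunk as a normal file upload plus 'chunk'/'chunks' counters.
    $name   = basename($_REQUEST['name']);
    $chunk  = isset($_REQUEST['chunk'])  ? (int) $_REQUEST['chunk']  : 0;
    $chunks = isset($_REQUEST['chunks']) ? (int) $_REQUEST['chunks'] : 1;

    $target = '/var/uploads/' . $name; // placeholder directory
    // Append this chunk; chunk 0 truncates any previous attempt.
    $out = fopen($target, $chunk === 0 ? 'wb' : 'ab');
    fwrite($out, file_get_contents($_FILES['file']['tmp_name']));
    fclose($out);

    if ($chunk === $chunks - 1) {
        echo 'done'; // last chunk received: $target is now complete
    }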
Take a look at the tus protocol, an HTTP-based protocol for resumable file uploads: you can carry on where you left off, without re-uploading the whole file, after any interruption. The protocol has also been adopted by Vimeo since May 2017.
You can find various implementations of the protocol in different languages here. In your case, you can use its JavaScript client, called uppy, together with a Go- or PHP-based server implementation.
"but I can not find any equivalent of request.getInputStream in PHP. "
fopen('php://input'); perhaps?
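Exactly: php://input exposes the raw request body, and streaming it in blocks keeps memory use flat no matter how large the chunk is. A minimal sketch (the filename query parameter is an invented placeholder):

    <?php
    // Minimal sketch: append the raw POST body to a target file.
    // 'filename' is an invented query parameter for illustration.
    $target = '/var/uploads/' . basename($_GET['filename']);
    $in  = fopen('php://input', 'rb');
    $out = fopen($target, 'ab'); // append so successive chunks accumulate
    while (!feof($in)) {
        fwrite($out, fread($in, 8192)); // 8 KB blocks keep memory use flat
    }
    fclose($in);
    fclose($out);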
I have created a JavaFX client that sends large files in chunks of the maximum post size (I am using 2 MB) and a PHP receiver script that assembles the chunks into the original file. I am releasing the code under the Apache license here: http://code.google.com/p/gigaupload/
Feel free to use/modify/distribute.
Try using the bigupload script. It is very easy to integrate and can upload files of up to 2 GB in chunks. The chunk size is customizable.
How about using a Java applet for the upload and PHP for the processing?
You can find an example for JUpload here:
http://sourceforge.net/apps/mediawiki/jupload/index.php?title=PHP_Example
You can use this package; it supports resumable chunked uploads.
In the examples/js-examples/resumable-chunk-upload example, you can close and re-open the browser and then resume incomplete uploads.
You can definitely write a web app that accepts a block of data (even via a POST) and appends that block to a file. It seems to me that you need some kind of client-side app that takes a file, breaks it into chunks, and sends them to your web service one chunk at a time. However, it may be a lot easier to create an SFTP directory and let clients upload files with a pre-existing SFTP client.

Scan folder on local (user's) PC and upload all files (images) to web server

I wish my users could select a directory on their PC and upload all the files in it, so they could upload a whole album (directory) instead of uploading every single file separately.
I would like to ask if this is somehow possible using PHP or JavaScript, without using any framework.
Thank you.
First of all, PHP can't do anything on the user's local computer, since it never runs there (unless the user's computer is also the server).
JavaScript runs on the user's local computer but isn't set up to handle things like this.
Java and Flash run on the user's computer and can be set up to do exactly this.
Look at SWFUpload. I highly recommend it.
And if you want Java, check out RadUpload. The lite edition is free.
A thing to note: what these Flash and Java solutions both do is accept a file selection from the user and then send the files to a PHP script, which handles the actual upload on the server.
It would probably make more sense for them to upload a .zip containing multiple images, which PHP can then extract.
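A minimal sketch of that extraction step with PHP's built-in ZipArchive (the album form field and target directory are placeholders, and real code should validate entry names before extracting):

    <?php
    // Minimal sketch: accept one uploaded .zip and unpack its contents.
    // 'album' and '/var/www/albums/' are placeholder names.
    if (!empty($_FILES['album'])) {
        $zip = new ZipArchive();
        if ($zip->open($_FILES['album']['tmp_name']) === TRUE) {
            $zip->extractTo('/var/www/albums/'); // unpack every entry
            $zip->close();
        }
    }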
I do not think it is possible as you describe it. Create a small utility they can run on their PC to do the job. Also check out how Facebook's image upload works; they upload dozens of images at the same time.
It's not possible using purely PHP/JavaScript. However, take a look at http://www.element-it.com/JavaPowUpload.aspx: it is a Java-based file uploader that allows you to completely hide the interface and, if you wish, drive the whole interface via JavaScript. However, it is not free, so it is perhaps not suitable for a personal project.
This may not meet your JavaScript requirement, but if you wish you could build your uploader as an ActiveX object and use cURL to actually perform the upload, or do it as a Java applet.
I built a Java-applet-based uploader for a client; I found resources online and used them as the base for building it.
SWFUpload, as mentioned in one of the answers you received, is a good one.
