PHP and SwfUpload

I'm trying to implement SwfUpload in my web page, and I'm using PHP to save the files on the server. Since this is the first time I've used this component, I chose to use the handler script suggested by the SwfUpload team (http://swfupload.org/forum/generaldiscussion/214): I put it in a file and told the control to use it as the code file.
It didn't work, hence this question, but what really drives me crazy is that I don't know how to debug this stuff! The request to the file is encapsulated in the Flash object, and I can't get any feedback from it if something goes wrong.
Is anyone here more experienced with this control?
Thanks

One thing you can do, if all else fails, is have upload.php append your debug messages to a log file. Example: file_put_contents("swfupload.log", print_r($_REQUEST, 1), FILE_APPEND);
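For instance, a tiny hypothetical logging helper at the top of upload.php (the log path and helper name are made up for illustration):

<?php
// Hypothetical debug helper: append a timestamped snapshot of each request,
// since the Flash client itself gives you no feedback.
function swf_log($label, $data)
{
    $line = date('c') . " $label: " . print_r($data, true) . "\n";
    file_put_contents('/tmp/swfupload.log', $line, FILE_APPEND);
}

swf_log('REQUEST', $_REQUEST);
swf_log('FILES', $_FILES);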
The only tricky thing about SwfUpload that you need to understand is that the Flash component runs with a different cookie jar (for security reasons), so you need to manually tell it (via a Flash param) the session id you currently have on the server. When it makes the HTTP request to upload.php, it passes that session id in a $_GET param, and the PHP script starts the session with that specific id: session_id($_GET['SESSION_ID']); session_start();. From that point on, upload.php behaves just like any other PHP code, with your session data available. You get the $_FILES, move them to their respective folders, save them in the DB, and that's it.
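A minimal sketch of the top of such an upload.php, assuming the session id arrives in a SESSION_ID query parameter and the file field uses SwfUpload's default name Filedata (adjust both to your configuration):

<?php
// Restore the browser's session first, since the Flash runtime
// does not share the browser's cookies.
if (isset($_GET['SESSION_ID'])) {
    session_id($_GET['SESSION_ID']); // must be called before session_start()
}
session_start();

// From here on $_SESSION is the same one the browser sees.
if (isset($_FILES['Filedata']) && $_FILES['Filedata']['error'] === UPLOAD_ERR_OK) {
    $dest = '/path/to/uploads/' . basename($_FILES['Filedata']['name']);
    move_uploaded_file($_FILES['Filedata']['tmp_name'], $dest);
}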

Well, SWFUpload needs to upload the file to some script, so you can log to a file from within that script based on the data received...

Oh, that's a pain to debug.
One way to get at the script output (which might be fatal errors that you can't log, or similar) is to use a proxy like Fiddler that shows you all HTTP traffic.
It's sometimes hard to get Flash to use the proxy. You may need to configure the proxy in IE even if you use another browser.


Handling Very Large Uploads [duplicate]

I want to allow uploads of very large files into our PHP application (hundreds of megabytes up to 8 gigabytes). There are a couple of problems with this, however.
Browser:
HTML uploads have crappy feedback; we need to either poll for progress (which is a bit silly) or show no feedback at all.
The Flash uploader puts the entire file into memory before starting the upload.
Server:
PHP forces us to set post_max_size, which could result in an easily exploitable DoS attack. I'd rather not set this setting globally.
The server also requires some other variables to be present in the POST vars, such as a secret key. We'd like to be able to refuse the request right away, instead of only after the entire file has been uploaded.
Requirements:
HTTP is a must.
I'm flexible with client-side technology, as long as it works in a browser.
PHP is not a requirement, if there's some other technology that will work well on a linux environment, that's perfectly cool.
upload_max_filesize can be set on a per-directory basis; the same goes for post_max_size
e.g.:
<Directory /uploadpath/>
php_value upload_max_filesize 10G
php_value post_max_size 10G
</Directory>
Python Handler?
Use a Python POST handler instead of PHP. Generate a unique identifier from your PHP app that the client can put in the HTTP headers; with mod_python you can then reject or accept the large upload before the entire POST body is transmitted.
I think
http://www.modpython.org/live/current/doc-html/dir-handlers-hph.html
allows you to check the headers and decline the rest of the POST input. I haven't tried it, but it might be the right path?
Looking at the source of mod_python, the buffering of the input via read() seems to allow bit-at-a-time evaluation of the HTTP input. Headers come first.
https://svn.apache.org/repos/asf/quetzalcoatl/mod_python/trunk/src/filterobject.c
I know it's old, but maybe someone has this problem nowadays, too.
Now you can do this with only JavaScript and, say, PHP. No Flash or Java required on the client side.
demo: http://dnduploader.filkor.org/
The idea is to slice the files with JavaScript's Blob slice() method...
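The receiving end of such a chunked upload can be very small. A hedged PHP sketch, assuming the client posts each slice as a chunk file field together with a made-up fileId identifier:

<?php
// Each request carries one slice produced by Blob.slice() on the client.
// Append it to a per-upload temp file; the client sends chunks in order.
$fileId = isset($_POST['fileId'])
    ? preg_replace('/[^a-zA-Z0-9_-]/', '', $_POST['fileId'])
    : '';
if ($fileId === '' || !isset($_FILES['chunk'])) {
    header('HTTP/1.0 400 Bad Request');
    exit('bad request');
}
file_put_contents(
    '/tmp/upload_' . $fileId . '.part',
    file_get_contents($_FILES['chunk']['tmp_name']),
    FILE_APPEND
);
echo 'ok';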
How about a Java applet? That's how we had to do it at a company I previously worked for. I know applets suck, especially in this day and age with all our options available, but they really are the most versatile solution to desktop-like problems encountered in web development. Just something to consider.
You can set post_max_size for just the scripts in one directory. Place your upload script there, and allow only that script to handle large sizes. It's still possible for that script to be attacked with large/useless files, but this avoids setting the limit globally.
Use that with APC and you might be able to work out something good:
IBM developerWorks article on APC
I've tried all of these... this is by far the best I have used yet...
http://www.uploadify.com/
Take a look at jumploader.com.
A good Java applet for uploading.
I've used it for uploading images and it works fine. I haven't tried files bigger than 10 MB, but it should work for really big files too.
Have you looked into using APC to check the progress and total file size? Here is a good blog post about it. It might help.
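If you try that route, the usual shape of the polling script looks roughly like this (this assumes apc.rfc1867 is enabled in php.ini and the upload form contains a hidden APC_UPLOAD_PROGRESS field before the file input; the key parameter name is illustrative):

<?php
// progress.php: polled via AJAX while the form posts to the upload script.
$key = isset($_GET['key']) ? $_GET['key'] : '';
$status = apc_fetch('upload_' . $key); // APC stores progress under 'upload_<key>'
if ($status !== false) {
    // 'current' and 'total' are byte counts maintained by APC.
    $percent = $status['total'] > 0
        ? round($status['current'] * 100 / $status['total'])
        : 0;
    echo $percent;
}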
Maybe you could use WebDAV and JavaScript in the browser:
AJAX Big file upload, with progress, to WebDAV
http://www.webdavsystem.com/ajax/programming/upload_progress
A simple library
http://debris.demon.nl/projects/davclient.js/doc/README.html
You can then have the JS redirect the user to a success page. Secret keys and what-not can be handled in a PHP prelude before handing off the JS client to WebDAV.
I would look into FTP, SSH, or SCP. These allow you to upload a large file and still have access control over it. This might take a little longer to implement, but it's probably the most secure way I can think of.
I know it sucks to add another dependency, but in my experience most websites doing something like this use Flash on the client side and upload the large file in chunks.
Adobe has a how-to on Flash file uploads.
I also found this tutorial on CodeProject:
Multiple File Upload With Progress Bar Using Flash and ASP.NET
PS - I know you're using PHP and not .NET; I figured the important part was the Flash ;)
I've had success with Uploadify, and I would recommend it. It's a jQuery/Flash script that handles large uploads, and you can pass extra parameters to it (like the secret key). To address the server-side issues, use something like the following code; note in the comments which settings can actually be changed at runtime and which cannot:
//Check to see if the key is there
if (!isset($_POST['secret_key']) || !isValid($_POST['secret_key'])) {
    exit("Invalid request");
}

function isValid($key)
{
    //Put your validation code here.
}

//This changes the timeout just for this script.
//Give it a value in seconds (3600 = 1 hour).
set_time_limit(3600);

//Generally speaking, memory_limit should be higher than your post size,
//and it can be raised at runtime:
ini_set("memory_limit", "8200M");

//Careful: post_max_size and upload_max_filesize are PHP_INI_PERDIR settings,
//so ini_set() can NOT change them at runtime. Set them in php.ini or in a
//per-directory block (see the <Directory> example above) instead:
//  php_value post_max_size 8192M
//  php_value upload_max_filesize 8192M
EDIT In response to your comment:
Given what you've said, I'm afraid you may not be able to meet your requirements over HTTP. All of the solutions out there bolt features onto HTTP that it was never designed for.
Like you said yourself, it's a simple protocol. Apart from writing your own client software that runs outside of the browser, using a Java applet, or using a different protocol (like FTP, which was designed for this), you might not get what you want.
I've done the best I could within the given constraints. Sorry I couldn't do better.
Try this: http://www.simple2ftp.com uses a Java-based FTP applet from within a clever PHP application wrapper.

PHP security, is a proxy file the solution?

If you have a PHP page, someone who loads it won't see the server-side PHP code in the output; but the file itself still has to be publicly reachable, because otherwise the person would not be able to load the page at all.
So someone with the right knowledge could 'grab' that file and then read the server-side script.
Wouldn't it be safer, then, to make a 'proxy'? For example: an AJAX POST call to a PHP page (a 'script handler'), passing a string whose first two characters are the id of the PHP script to run and the rest of which is the data for that script. The script handler then runs an include based on that id and returns the echoed HTML, which is then displayed.
What do you guys think? I have done this and it works quite nicely; if I grab the source, all I get is an HTML page with a div container and a JavaScript file with AJAX calls to the script handler.
No. Your 'workaround' does not fix the problem, if there ever was one.
If a client (a browser) asks for a 'resource' (a page, for example) from a webserver, the webserver won't just serve the resource as it finds it on disk.
If you configured your webserver well, it will know that
An .html, .gif, .png, .css, .js file can just be served as-is.
A .php, .php5, .cgi, .pl file has to be executed first, and the resulting output has to be served.
So with a properly configured server (and most decent webservers are properly configured by default), grabbing the PHP source just by calling the page is impossible - the webserver will know to execute the source and return the result.
But
One of the most common bugs when writing your own 'upload/download script' is allowing users to upload/download .php (or other executable) files. If your own script 'serves' the .php file by reading it from disk and writing it to the net, users will be able to see your code (a safer pattern is sketched after the list below).
Solution:
Don't write scripts unless you know what you are doing.
Avoid the not-invented-here syndrome (don't reinvent the wheel unless you are sure you NEED a better wheel AND can MAKE a better wheel)
Don't solve problems that don't exist!
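For the upload/download bug described above, a minimal sketch of the safer pattern: whitelist a directory, strip the path, and always send the bytes as an opaque download instead of including or executing the file (the directory and parameter names are illustrative):

<?php
// download.php?f=report.pdf - serve files from one fixed directory only.
$dir  = '/var/www/files/';
$name = isset($_GET['f']) ? basename($_GET['f']) : ''; // basename() kills ../ tricks
$path = $dir . $name;
if ($name === '' || !is_file($path)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
readfile($path); // streamed as data, never executed as PHP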
By the by:
If your webserver were misconfigured and simply served .php files as viewable/downloadable files, your 'solution' of calling it via AJAX would not change this... AJAX is still client-side, so any client could bypass the AJAX and fetch the script directly.
If your web server is configured correctly, users should never be able to view the actual contents of the PHP file. If they try, they should see the actual output of the PHP script as your web server reads and executes it, then passes that as the response to the HTTP request.
Furthermore, you need to understand that users can easily still look at the file the AJAX request is fetching; all they need to do is install Firebug, or use the Chrome developer tools, and they'll be able to see the full URL the file is fetched from.
So to sum up, firstly you shouldn't need to use this kind of 'security technique' for PHP files, and secondly, the 'security technique' will not stop anyone with more than a passing interest in your data.

Using uploaded file without saving to server disk

I have created a PHP script to upload a file, but unfortunately I don't have permission to save files on the disk. I have to upload an Excel file (using PHPExcel), then read all the rows in the file and save the data. Is there any way for me to process this file without saving it to disk? I tried to read $_FILES['file1']['tmp_name'], but it doesn't work.
Could you please suggest a method to process this file?
Thank you for the consideration.
By "save to disk" you mean to send it back to the user for him to download it?
Usually, you shall have write access to (at least) the PHP temporary directory. Have you tried whether the form and script work in a local environment? Maybe there is something elso wrong with the upload?!
Finally: Why so you not have the persmission to save files? Are you allowed to create a subdirectory below you PHP file (via FTP) and give that one full permissions?
I tried to read $_FILES['file1']['tmp_name']
Most probably you have just encountered an error; that happens to beginner programmers very often.
You have to fix that error instead of looking for odd workarounds.
Start by checking $_FILES['file1']['error']. What does
var_dump($_FILES['file1']['error']);
say?
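A small sketch of what that check could look like, using PHP's built-in upload error constants:

<?php
// $_FILES['file1']['error'] tells you exactly why an upload failed.
switch ($_FILES['file1']['error']) {
    case UPLOAD_ERR_OK:
        // tmp_name is now safe to read (e.g. hand it to PHPExcel).
        break;
    case UPLOAD_ERR_INI_SIZE:
    case UPLOAD_ERR_FORM_SIZE:
        exit('File exceeds the allowed size.');
    case UPLOAD_ERR_PARTIAL:
        exit('File was only partially uploaded.');
    case UPLOAD_ERR_NO_FILE:
        exit('No file was uploaded - check the field name and the form enctype.');
    default:
        exit('Upload failed with error code ' . $_FILES['file1']['error']);
}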
Instead of sending your files with a form (multipart data over HTTP POST), you can send your files with a little bit of JavaScript via the HTTP PUT method to your server.
This scenario is described in the official PHP documentation -> PUT method support.
Due to some restrictions described in the documentation, you have to do some workarounds to get it to work properly.
You can read the raw input stream from your webserver. The data will be piped from your webserver to your PHP program and only held in memory.
Doing a PUT AJAX call with jQuery was answered here. You can also use a jQuery upload plugin like Uploadify.
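A hedged sketch of the receiving side: read the raw PUT body from php://input, which streams from the webserver without PHP writing a temporary file first:

<?php
// put.php: accumulate the PUT body in memory, 8 KB at a time.
$in   = fopen('php://input', 'rb');
$data = '';
while (!feof($in)) {
    $data .= fread($in, 8192);
}
fclose($in);
// $data now holds the uploaded bytes, ready for an in-memory parser.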

How to setup PHP to be a "pipe" for downloads from another location?

I want to set up a PHP script, hosted on my server, that will let me download files from other locations while making it look like they're coming from my server. Maybe using cURL or .htaccess. Also, I was hoping there would be a way to avoid having my server deal with the bandwidth. Does that make sense? Is this doable?
-- Update
Kind of like a proxy, but without the file being downloaded into memory and then sent to the client.
You can do this by simply passing the target URL to your script, opening the URL with file_get_contents(), cURL, or other file functions, and echoing the data. Be sure to set the Content-Type header to "application/octet-stream" to force the browser to save the file instead of displaying it.
As for the bandwidth: you'll have to deal with it. If your server downloads a file, it will use up the bandwidth; it will even use it up twice, because it has to receive AND send the data.
I don't know why you mention .htaccess, because that has nothing to do with your problem.
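A minimal sketch of such a pass-through script, streaming in small chunks so the file is never fully buffered in memory (requires allow_url_fopen; the url parameter handling is illustrative, and real code must whitelist the allowed targets or you have an open relay):

<?php
// pipe.php?url=... - relay a remote file through this server.
$url = isset($_GET['url']) ? $_GET['url'] : '';
// WARNING: validate $url against a whitelist before opening it.
$src = fopen($url, 'rb');
if ($src === false) {
    header('HTTP/1.0 502 Bad Gateway');
    exit;
}
$name = basename(parse_url($url, PHP_URL_PATH));
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
while (!feof($src)) {
    echo fread($src, 8192); // 8 KB at a time, straight to the client
    flush();
}
fclose($src);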
Also I was hoping that there would be a way to get around having my server deal with the bandwidth. Is this doable?
No.
I'd recommend setting up a linking system on your site, like http://example.com/download.php?id=12, that forwards directly to the remote file. That way you'd save on bandwidth, and if someone looks at the link on your page it would look like it's coming from your server. It would still show the other site in the download manager, but if you're trying to save bandwidth that's a small price to pay.
Thanks for the help... I figured out what I needed to do: I'm going to use mod_xsendfile. It lets you set an external source for where the file is located, and then lets the user download the file without knowing where it actually is.
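For reference, once mod_xsendfile is installed and XSendFilePath allows the directory, handing the download off to Apache is typically just a header from PHP (file names here are illustrative; note the path must be one the server itself can read):

<?php
// PHP only decides WHETHER to serve the file; Apache streams it.
header('X-Sendfile: /data/files/archive.zip');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="archive.zip"');
// No readfile() needed - mod_xsendfile takes over from here.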

php blocking when calling the same file concurrently

I'm having a really strange problem.
I wrote a file manager in PHP with the ability to download files, which works fine.
The whole script is built as one big file.
Now, while downloading a big file, I'm not able to use the script at the same time for, say, browsing folder contents. It does nothing but keep loading. As soon as the download is finished, everything works again.
Is there something that prevents PHP from parsing the same file concurrently? Other scripts work like a charm, no matter whether I'm downloading or not.
Help or links to documentation are highly appreciated :)
Do you use sessions?
If yes, then that's probably the problem. The default session handler uses files, which have to be locked while session-enabled code executes. Practically, this means each user executes PHP files sequentially. To solve this you must use a custom session handler that uses a DB. Read this.
Edit: I want to point out that writing a custom session handler with no locking can be difficult and introduce various subtle bugs. Read more docs on this if you need to do it!
Edit 2: Sometimes using session_write_close() to close the session when no longer needed is enough (see the comments).
Daremon is correct, but you shouldn't need to use a different session handler. If you call session_write_close() before you start sending the file, the lock on the session file will be released and your other scripts should be able to continue.
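A sketch of that fix inside the download branch of such a file manager (file names are illustrative):

<?php
session_start();
// ... permission checks that need $_SESSION go here ...

// Release the session file lock BEFORE the long-running download,
// so other requests from the same browser are no longer serialized.
session_write_close();

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="big-file.bin"');
readfile('/path/to/big-file.bin'); // may run for minutes; session stays unlocked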
