Uploading of file seems to take forever! - php

I have made a program with which I can upload a file. Everything works fine. But when I tried uploading an 11 MB file, it seems to load forever, sending the file to the server without ever finishing.
I have already tried setting upload_max_filesize to 20M.
Any ideas what could be the cause and how to resolve it?
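One thing worth checking alongside upload_max_filesize: post_max_size must also be larger than the file, or PHP silently discards the request body and the upload arrives empty even though the browser transfers the whole file. A sketch of the relevant php.ini settings (the values are illustrative only):

; php.ini -- illustrative values, tune for your needs
upload_max_filesize = 20M
post_max_size = 25M      ; must exceed upload_max_filesize
max_input_time = 300     ; seconds PHP may spend receiving the request body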

The why is almost certainly related to your connection speed. Unless you're connecting via a LAN to the machine in question you'll in all probability be connecting via a consumer grade broadband connection. These are pretty much always configured so that download speed is a lot higher than upload speed. As a consequence, that 20 meg file that takes a minute or so to download will take 10 minutes or more to upload over the same connection.
What can you do about this, other than switching to an enterprise-grade broadband connection? Not a great deal: the bits will only transfer so fast over the connection you've got and no faster. What you can do, however, is at least keep the user informed of how the upload is progressing. PHP from version 5.2 onwards provides hooks that you can use to monitor the progress of a file upload. You can use JavaScript to poll these hooks and display a progress bar to the user.
http://www.phpriot.com/articles/php-ajax-file-uploads
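On newer PHP (5.4+), the built-in session upload progress feature provides the same kind of hook without any extension. A minimal sketch of the polling endpoint the JavaScript would call (the field value "myupload" is illustrative):

// progress.php -- polled by JavaScript while the upload runs.
// Assumes session.upload_progress.enabled = On (the default) and that the
// upload form contains a hidden field named after session.upload_progress.name
// with the value "myupload".
session_start();
$key = ini_get('session.upload_progress.prefix') . 'myupload';
if (isset($_SESSION[$key]) && $_SESSION[$key]['content_length'] > 0) {
    $p = $_SESSION[$key];
    echo round($p['bytes_processed'] / $p['content_length'] * 100);
} else {
    echo 100; // no entry: the upload has finished (or never started)
}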

How long is "forever"? A typical upload speed on consumer broadband is 256 kilobits per second; at that speed an 11 megabyte file (about 88,000 kilobits) will take roughly 88,000 / 256 ≈ 344 seconds, over five minutes, to upload.
If you are using the Google Chrome web browser, you get an upload progress bar so you can tell if it is working or not.

Related

Apache/PHP response time and execution time

I have a web server application with Apache, PHP and MySQL on Windows Server 2008. The server also serves web pages and images.
Recently I have noticed that some users (8 out of 150) who upload images get a response time from Apache of, for example, 200 seconds, while the execution time of the PHP script is 2 seconds. Other users are not affected, even though they're using the same script.
I know these times because I'm logging each request in a MySQL table.
To obtain the Apache response time before the execution ends I use
microtime(true) - $_SERVER["REQUEST_TIME_FLOAT"]
And to obtain the PHP execution time I use
microtime(true) - $GLOBALS["tiempo_inicio_ejecucion"];
where $GLOBALS["tiempo_inicio_ejecucion"] is another microtime(true) value taken at the beginning of the script execution.
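Put together, the two measurements look like this (a sketch; the logging destination is illustrative, and $_SERVER["REQUEST_TIME_FLOAT"] requires PHP 5.4+):

// At the very top of the script, before any other work:
$GLOBALS["tiempo_inicio_ejecucion"] = microtime(true);

// ... handle the uploaded image ...

// Just before the response ends:
$apache = microtime(true) - $_SERVER["REQUEST_TIME_FLOAT"];       // whole request, incl. receiving the body
$php    = microtime(true) - $GLOBALS["tiempo_inicio_ejecucion"];  // script execution only
// A large gap between the two means time was spent before the script
// started running: receiving the request body, a scanner holding the file, etc.
error_log(sprintf("apache=%.2fs php=%.2fs gap=%.2fs", $apache, $php, $apache - $php));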
The server load is low; CPU and RAM are far from their limits.
If I try to reproduce this behaviour by uploading files from my PC, I can't: the uploads are fast.
I suppose it is some network issue, but I can't get it solved; or maybe it is a network issue on the clients' side.
How can I know what is happening here?
Thanks in advance.
Possible suggestion: a virus checker.
Your server may have a virus checker installed which scans files automatically as they are created, so it will be scanning the uploaded files. It may be running low on resources, or being given a low priority by the server, so its scans of the uploaded files take a long time. It won't release a file to the web server until the scan is complete, and so the server takes a long time to start running the PHP code.
I have no idea whether this is actually your problem, but I have seen similar problems on other Windows Server boxes. It can be a very difficult problem to diagnose.

Increasing the max concurrent file uploads on a LAMP server

I have an issue where my client wants to be able to upload more than 2 large files at a time, and I was wondering if any of you have a solution. I'm pretty sure it is a setting in either Apache or PHP.
Here is the scenario: I have an upload form for videos with a max file size of 1024 MB (1 GB). When a user uploads one file that is less than or equal to 1 GB it works fine; they then open a new tab and upload another large file, but when they open a third tab of that form and try to upload, I get an IO error from the form. So I assume either PHP or Apache is limiting the maximum number of concurrent uploads to the server.
Edit:
When I use another computer with a totally separate internet connection (a separate/different external IP) I can upload there as well, so it seems like a maximum of concurrent uploads per client/IP.
Increasing the memory assigned to a PHP script may help you. I don't think it is the most appropriate solution, but when I was having problems handling lots of big files, I increased the memory available to the script. If you are developing this for a very busy site I don't recommend this method, but try increasing the memory:
ini_set('memory_limit','128M');
For testing, if you use -1 instead of 128M, PHP will place no limit on the memory given to the script. You can use this to test whether the problem is caused by the memory limit.

Synchronized timer on multiple networked computers

I am working on an art/programming project that involves using a lab of 30 iMacs. I want to synchronize them in a way that will allow me to execute a script on each of them at the very same time.
The final product is in Flash Player, but if I am able to synchronize any type of data signal through a web page, I'd be able to run the script at the same time. So far my attempts have all had fatal flaws.
The network I'm using is somewhat limited. I don't have admin privileges, but I don't think that matters really. I log into my user account on all 30 iMacs and run the page or script so I can run my wares.
My first attempts involved running Flash Player directly.
At first I tried using the system time and had the script run every two minutes. This wasn't reliable, because even though the time in my user account is synced there is a discrepancy between iMacs; even a quarter of a second is too much.
My next try involved having one Mac act as the host, writing a variable to a text file. All other 29 Flash Players checked for changes in this file multiple times a second. This didn't work: it would work with 3 or 4 computers but then became flaky. The strain on the server was too great, and Flash is just unreliable. I figured I'd try using local shared objects, but that wasn't reliable. I tried having the host computer write to 30 files and having each Mac read only one each, but that didn't work either. I tried using LocalConnection, but it is not made for more than two computers.
My next try involved having a PHP server-time script run on my web server and having the 30 computers poll that file's time nearly 30 times a second. I don't think my hosting plan supports this, because the server would just stop working after a few seconds; too many requests or something.
Although I haven't had success with a remote server, it will probably be more reliable with another clever method.
I do have one kludge solution as a last straw (you might laugh): I would take an audio wire, buy 29 audio splitters, and plug all of them in. Then I would run Flash Player locally and have it execute when it hears a sound. I've done this before; all you have to do is touch the other end of the wire and the finger static is enough to set it off.
What can I do now? I've been working on this project on and off for a year and just want to get it going. If I can get a web page synchronized on 30 computers in a lab, I could just pass data to Flash and it would likely work. I'm more confident with a remote server, but if I can do it using the local Mac network, that would be great.
OK, here is how I approached my problem, using a socket connection with Flash and PHP. Basically, first you set up a client script that is to be installed on all 30 iMac 'client' machines. Let's assume all machines are on a private network. When these clients are activated they connect to a PHP server via a socket. The PHP server script has an IP and a port that the clients connect to; it handles the client connection pool, message routing, etc., and it runs at all times. The socket connection allows server-client interaction by sending messages back and forth, and these messages can trigger things to do. You should read up more on socket connections and server-client interaction. This is just a little summary of how I got my project done.
Simple tutorial on socket/server client connection using php and flash
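A minimal sketch of such a PHP broadcast server (the port 9000, the client count, and the "go" message are illustrative; a real Flash client would additionally need a socket policy file served):

// server.php -- run with `php server.php` on one machine.
// Clients connect via socket; once all 30 are in, a single "go"
// message is sent to every client at once so they all fire together.
$server = socket_create(AF_INET, SOCK_STREAM, SOL_TCP);
socket_set_option($server, SOL_SOCKET, SO_REUSEADDR, 1);
socket_bind($server, '0.0.0.0', 9000); // hypothetical port
socket_listen($server);
$clients = array();

while (true) {
    $read = array_merge(array($server), $clients);
    $write = $except = null;
    socket_select($read, $write, $except, null); // block until activity
    if (in_array($server, $read)) {
        $clients[] = socket_accept($server);     // a new client connected
    }
    if (count($clients) === 30) {                // all iMacs are in
        foreach ($clients as $c) {
            socket_write($c, "go\n");            // every client fires on this
        }
        break;
    }
}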

Why would file uploads stop on a busy LAMP server?

We have a LAMP server that is fairly busy; CPU usage hovers around 90% at peak times. We are having an intermittent problem where file uploads from web forms fail. It only seems to happen with larger files (over a MB) and it seems to affect some users more than others. We've gone through and checked the obvious stuff like the PHP ini max upload sizes, execution times, and folder write permissions. Also, the site worked for a year without trouble like this, and it suddenly began (we don't think any of our application PHP would cause this).
We've watched what happens in Charles Proxy and it shows the upload happening (the sent file size increases regularly) until it just stops sending data. The browser shows its spinning progress as if it's proceeding, but you can wait 20 minutes and nothing will happen, or it reports a timeout.
Does anyone know why an upload might fail intermittently? My only guess is that it has to do with server traffic, like Apache closing the connection prematurely.
If the load on the server is high, then your scripts may be timing out while trying to upload the file. Can you give any more specifics on your problem? PHP scripts have a 30-second execution timeout by default (max_execution_time), meaning that if the script has not completed, i.e. finished handling the uploaded file, within that time frame, the script will time out and the upload will fail.
If the site worked for over a year and traffic has now grown to the point where it is starting to strain the server, then it is plausible that scripts are timing out under the increased load.
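If timeouts are the suspect, the limits can be raised while you investigate; a minimal sketch (the value is illustrative, not a recommendation):

// Give a long-running upload handler more headroom.
// set_time_limit() restarts the execution timer (default 30 seconds).
set_time_limit(300);

Note that the time spent receiving the request body is governed separately by max_input_time in php.ini, so a slow transfer can stall before the script's own timer even starts.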

Where's the bottleneck?

I have two test computers networked together.
One has gigabit Ethernet, the other 10 megabit.
Theoretically, data transferred between the two should reach about 1 megabyte per second.
Now I'm using a PHP script to read data from one host on the other using fread. Both the reading script and the file to be read are chmod 777.
Both computers are running WampServer, and both have ZoneAlarm and Avast installed and running. ZoneAlarm is configured to recognise both computers as trusted parts of the network.
I'm using the time() function to work out how long the script takes to read a file on the other computer. The file I'm reading is 10 megabytes, so it should take just over 10 seconds, yet it takes around 30 seconds: an average of roughly 300 KB/s.
So where is the bottleneck in my setup?
One computer runs Vista, the other XP, if that matters.
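As an aside, time() only has one-second resolution; microtime(true) gives a sharper measurement. A minimal sketch of the timing (the URL is hypothetical; point it at whatever the other machine's WampServer exposes, and allow_url_fopen must be on):

// Time an fread() loop over the network with sub-second resolution.
$start = microtime(true);
$fh = fopen('http://192.168.0.2/testfile.bin', 'rb'); // hypothetical URL
$bytes = 0;
while (!feof($fh)) {
    $bytes += strlen(fread($fh, 8192)); // read in 8 KB chunks
}
fclose($fh);
$elapsed = microtime(true) - $start;
printf("Read %d bytes in %.2f s (%.0f KB/s)\n", $bytes, $elapsed, $bytes / $elapsed / 1024);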
Just because your network speed is 10 Mb/s doesn't mean that the application layer gets that. There is TCP/IP header overhead (~64 bytes per 1500-byte packet), time spent processing buffers in the kernel, time spent doing buffer transfers to/from the LAN controller chip, etc.
I assume when you say you're getting 300 KB/s you really mean about 3 Mb/s, right?
While there are a lot of guesses we could take, this probably belongs on Server Fault, as you are not asking what the issue is programming-wise, and honestly, even there, this will take a lot of trial and error. It's not really suitable for question/answer.
Open up the Task Manager (Ctrl+Alt+Delete, Task Manager), then switch to the second tab (or the third?) and watch the CPU and network usage as you run the test. If the CPU usage is at 100%, that may be the bottleneck. Check the network usage too, to see if there is any overhead you don't expect.
That's where I'd start.
