I originally posted this over on the Drupal Stack Exchange, but it was suggested I try here since it seems to be related to my server rather than to Drupal.
When uploading files through PHP, the upload speed can slow down drastically. For example, I'll be uploading a 400 MB video at 10 Mbps when, suddenly, it drops to less than 100 Kbps. After some time, it speeds up again. Then it slows down again, and the cycle repeats. There is no consistency as to when this happens. I can reproduce it with both large and small files, but since it's erratic, it's harder to observe with small files. I have not observed this when uploading through SCP, so I assume it's not a network issue.
Here is what I know.
It's not my internet connection. I've had multiple people try from different places, with the same results.
Tried on several browsers with the same result.
Eventually the upload will complete but in a lot of instances, an upload that should take 4 minutes ends up taking 30 or more.
PHP is set to allow 2 GB file uploads, and PECL uploadprogress is installed.
Though I use Drupal, I've also tried uploading files through a straightforward PHP upload form, with the same results. So it's not a Drupal issue.
What I think is happening: at some point during the upload process, some sort of buffer is being hit. I don't even know where to begin looking for that at the server or OS level.
We have plenty of disk space (more than 1 TB) and plenty of RAM (24 GB).
I'm hoping someone here has experienced similar or can suggest where to begin looking. Thanks for reading!
Even though there seem to be a few duplicate questions, I think this one is unique. I'm not asking whether there are any limits; it's only about performance drawbacks in the context of Apache, or the Unix file system in general.
Let's say I request a file from an Apache server:
http://example.com/media/example.jpg
does it matter how many files there are in the same directory "media"?
The reason I'm asking is that my PHP application generates images on the fly.
Once an image is created, the application places it at the same location that would otherwise trigger the PHP script via mod_rewrite. If the file exists, Apache skips the PHP execution entirely and serves the static image directly instead. A kind of gateway cache, if you want to call it that.
Apache then basically has two things to do (see the sketch after this list):
Check if the file exists
Serve the file or forward the request to PHP
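For illustration, a minimal .htaccess sketch of that pattern might look like this (the generate.php script name and the media/ prefix are assumptions for the example, not the actual setup):

    RewriteEngine On
    # If the requested file does not exist on disk...
    RewriteCond %{REQUEST_FILENAME} !-f
    # ...hand the request to the PHP generator; otherwise Apache serves the static file directly.
    RewriteRule ^media/(.*)$ generate.php?file=$1 [L,QSA]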
So far, I have about 25,000 files totaling about 8 GB in this single directory. I expect that to grow at least tenfold over the next few years.
While I don't have any issues managing these files, I have a slight feeling that requests for them over HTTP keep getting slower. So I wondered whether this is really happening or whether it's just my subjective impression.
Most file systems based on the Berkeley FFS will degrade in performance with large numbers of files in one directory due to multiple levels of indirection.
I don't know about other file systems like HFS or NTFS, but my suspicion is that they may well suffer from the same issue.
I once had to deal with a similar issue and ended up using a map for the files.
I think it was something like md5('myfilename-00001'), yielding (for example) e5948ba174d28e80886a48336dcdf4a4, which I then stored in a file named e5/94/8ba174d28e80886a48336dcdf4a4. A map file then mapped 'myfilename-00001' to 'e5/94/8ba174d28e80886a48336dcdf4a4'. This not-quite-elegant solution worked for my purposes, and it only took a little bit of code.
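A rough PHP sketch of that kind of hashed layout (the function name, base directory, and 2+2-character split are made up for illustration, not the original code):

    <?php
    // Map a logical file name to a sharded path like e5/94/8ba174d28e80886a48336dcdf4a4.
    function hashedPath($baseDir, $logicalName) {
        $hash = md5($logicalName);
        $dir  = $baseDir . '/' . substr($hash, 0, 2) . '/' . substr($hash, 2, 2);
        if (!is_dir($dir)) {
            mkdir($dir, 0755, true);    // create both shard levels if missing
        }
        return $dir . '/' . substr($hash, 4);
    }

    // Keep a map from the original name to the hashed location.
    $map = array();
    $map['myfilename-00001'] = hashedPath('/var/www/files', 'myfilename-00001');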
I am using this extension to perform export/import for the product database on our website. The site is pretty snappy: it loads quickly on a killer server, and everything functions flawlessly except for the product import/export.
Everything worked fine up until the point where we had about 12,000 products in the catalog. Product import still appears to work fine; the problem is that exporting products is choking. Here's what happens: I click export, it hangs for about 10-12 minutes (during which time the site goes down, unless I kill the process via the CLI), then it ends up at a "page not found" error on the same link the admin export function was accessing.
Technical data & stuff I have tried or considered...
The import/export code may be downloaded here. Opencart is based on an MVC framework, so the controller and model are obviously the important files to look at.
I have upgraded the original plugin to use the absolute latest versions of PHPExcel and the PEAR library, with the OLE and Spreadsheet extensions--both used by the import/export module.
php.ini settings are maxed out--allowing up to 8 GB of RAM, and post_max_size, the upload maximum, and all other relevant settings are pretty much maxed out as well. The server is running dual quad-core Xeons with a number of SAS hard drives, with average processor usage around 3%. So it's not the server, and unless I'm missing something, it's not the PHP settings that are the root of the problem here.
There are no errors being thrown in the error log that would indicate any specific problems in the code. Just the fact that it was working before, and now locks up while exporting products when more than 12k products are in the DB.
I have tried repairing the product tables, optimizing the database, and re-installing the base Opencart framework.
I realize this is a pretty general question here, but I'm at my wits end. I am not going to code a custom import/export module from scratch to nail this problem down. Simply hoping that someone might be able to shed some light (extension author has not been able to answer this issue). I've picked this thing apart from top to bottom and can't find any reason why it wouldn't be working the way it should.
10-12 minutes is a relatively short time for larger imports; I've seen them last over 45 minutes on dedicated servers with plenty of RAM, with the same problem as yours where the site became totally unresponsive during the upload/import. The problem is the inefficiency of using Excel and decoding all of the values from the saved Excel sheet. I did actually custom-code an efficient version for a client back on 1.4.X, but it was by no means pretty and took quite a lot of debugging.

The actual export was massively inefficient too, just joining all the tables together and taking up vast amounts of memory (over 1.8 GB if I remember correctly). This too was massively reduced by selecting smaller duplicated rows and parsing them separately, then inserting them back into arrays for the export data. It was quite incredible just how much faster this was.
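To illustrate the idea (this is a hypothetical sketch, not the module's code; the table names, column names, and the $db helper, assumed to return an array of associative rows, are all assumptions):

    <?php
    // Export products in batches so memory use stays flat instead of joining everything at once.
    $batchSize = 500;
    $offset    = 0;
    $fh = fopen('/tmp/products.csv', 'w');

    do {
        $products = $db->query("SELECT product_id, model, price FROM product LIMIT $offset, $batchSize");
        foreach ($products as $row) {
            // Pull related rows per product instead of one giant multi-table join.
            $desc = $db->query("SELECT name FROM product_description WHERE product_id = " . (int) $row['product_id']);
            $name = isset($desc[0]['name']) ? $desc[0]['name'] : '';
            fputcsv($fh, array($row['product_id'], $row['model'], $row['price'], $name));
        }
        $offset += $batchSize;
    } while (count($products) == $batchSize);

    fclose($fh);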
The solution was simpler than I would have ever imagined.
Export tool that actually works for large product databases (and of course, exports to CSV).
Here is the link.
I'm looking at building a web app that includes a file upload element. I'd like users to be able to upload files of any type and of fairly large size (say, up to 100MB). This will be a publicly accessible site, so security is obviously very important.
I've done a decent amount of googling in search of answers, but it's difficult when I don't really know exactly what I'm searching for.
My experience is mainly with PHP, but I realise that PHP is not considered to be the best when it comes to file uploading, so I'm happy to look at other languages if necessary. Although, if a decent solution using PHP can be achieved, that would be preferable.
As I have no experience with this kind of project, I'm also fairly in the dark on what kind of server setup is required for such an app.
I have brainstormed a few ideas, but am willing to budge on them if they're unreasonable:
I'd like to use Amazon S3 to store the files if possible (to reduce the load on the server)
I'd like to be able to rename the files after upload
I'm considering Uploadify (uploadify.com) for the client side
Basically, imagine I was looking to build a file-sending app like wetransfer.com or yousendit.com and you'll get the general idea.
I'm familiar with all the usual PHP file upload issues (checking mime-types, upload_max_filesize, memory_limit, etc, etc) covered by 99% of posts on the internet on this topic, but obviously this project goes a fair bit beyond your average, run-of-the-mill avatar upload script.
I know this is a massive topic and I'm obviously not expecting anyone to present me with a magic solution, but basically I'm looking for some pointers on where to start. Can anyone recommend any good books, articles or websites where I can gain a better understanding of the requirements of the task? Covering everything from the programming to the server requirements? Even if it's just a list of keywords or phrases that I should be googling.
Thanks in advance!
P.S. I wasn't 100% sure if this was the right StackExchange site to post this question on. I also considered serverfault.com and webmasters.stackexchange.com. If you think this question would be better asked elsewhere, please let me know.
If you funnel the upload through your PHP application, you need to make sure that it accepts those large files; in particular upload_max_filesize, post_max_size and max_input_time. See the PHP manual's "POST method uploads" page for a general how-to.
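For example, something along these lines in php.ini (the values are illustrative, not prescriptive):

    ; allow uploads somewhat larger than the 100 MB target
    upload_max_filesize = 110M
    post_max_size = 120M       ; must be larger than upload_max_filesize
    max_input_time = 300       ; seconds allowed for reading the request body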
With Resumable.js you could circumvent the above limitations quite nicely. It uploads your 100 MB in small chunks, which lets it keep track of what has been uploaded and allows pausing and resuming uploads.
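A rough server-side sketch of how those chunks could be reassembled in PHP (the resumable* parameter names and the 'file' field are what Resumable.js posts by default, as far as I recall; the temporary and final paths are assumptions):

    <?php
    // Receive one chunk from Resumable.js and stitch the file together once all chunks have arrived.
    $identifier  = preg_replace('/[^A-Za-z0-9_-]/', '', $_POST['resumableIdentifier']);
    $chunkNumber = (int) $_POST['resumableChunkNumber'];
    $totalChunks = (int) $_POST['resumableTotalChunks'];
    $filename    = basename($_POST['resumableFilename']);

    $chunkDir = sys_get_temp_dir() . '/chunks_' . $identifier;
    if (!is_dir($chunkDir)) {
        mkdir($chunkDir, 0700, true);
    }

    // Store this chunk under its sequence number.
    move_uploaded_file($_FILES['file']['tmp_name'], $chunkDir . '/' . $chunkNumber);

    // When every chunk is present, concatenate them in order and clean up.
    if (count(glob($chunkDir . '/*')) == $totalChunks) {
        $out = fopen('/var/uploads/' . $filename, 'wb');
        for ($i = 1; $i <= $totalChunks; $i++) {
            fwrite($out, file_get_contents($chunkDir . '/' . $i));
            unlink($chunkDir . '/' . $i);
        }
        fclose($out);
        rmdir($chunkDir);
    }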
While I've never worked with Amazon S3, I do not believe you can upload data from any client - at least not without some sort of authentication. You'll probably have to funnel the upload through your own server in order to push it to S3.
I'm developing a webapp in PHP, and the core library is 94kb in size at this point. While I think I'm safe for now, how big is too big? Is there a point where the script's size becomes an issue, and if so can this be ameliorated by splitting the script into multiple libraries?
I'm using PHP 5.3 and Ubuntu 10.04 32bit in my server environment, if that makes any difference.
I've googled the issue, and everything I can find pertains to PHP upload size only.
Thanks!
Edit: To clarify, the 94kb file is a single file that contains all my data access and business logic, and a small amount of UI code that I have yet to extract to its own file.
Do you mean you have one file that is 94 KB in size, or that your whole library is 94 KB in total?
Regardless, as long as you aren't piling everything into one file and you're organizing your library into different files, your file sizes should remain manageable.
If a single PHP file is starting to hit a few hundred KB, you have to think about why that file is getting so big and refactor the code to make sure that everything is logically organized.
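For instance, one common way to split a single big library file into per-class files is a small autoloader (the lib/ directory and class-per-file naming here are assumptions, just to show the shape):

    <?php
    // Load classes on demand, e.g. class ProductMapper from lib/ProductMapper.php.
    spl_autoload_register(function ($class) {
        $file = __DIR__ . '/lib/' . str_replace('\\', '/', $class) . '.php';
        if (is_file($file)) {
            require $file;
        }
    });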
I've used PHP applications that probably included several megabytes worth of code; the main thing if you have big programs is to use a code caching tool such as APC on your production server. That will cache the compiled (to byte code) PHP code so that it doesn't have to process every file for every page request and will dramatically speed up your code.
Before anyone rips me a new one... I HAVE PERMISSION to hotlink images from an external site. It all works fine; however, I don't like that every time I refresh the page it pulls the images again. My server is running PHP; is there a way to cache the images once and then display them via some local code? I'm really just looking for a way to speed up the page and not waste anyone's bandwidth. Thanks in advance.
I was looking for an answer to this myself and didn't find anything that fit my needs perfectly. TimThumb came close (you'll have to Google it; I'm a newbie and can thus only post one hyperlink), but it was a little overkill (it has all kinds of image manipulation stuff built in) and couldn't handle some of the image types I was interested in using (specifically *.ico files). So I wrote my own quick-n-dirty PHP script that should handle any image type and is concerned only with caching the images and passing them through without any modifications.
I'm a bit concerned my script may have glaring security flaws or could be more efficient. Also, it's not very smart the way it caches. It never bothers to check later to see if the image has been updated, and it never bothers to clean up its own cache. If anyone has suggestions for improvements to my code, I'm open to feedback.
Here's the script: Warm linker - RefactorMyCode.com
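For reference, a minimal sketch of that kind of pass-through cache (the allowed host, cache directory, and src parameter are made up for illustration; the linked script handles more cases):

    <?php
    // Fetch a remote image once, cache it locally, and serve the cached copy on later requests.
    $allowedHost = 'images.example.com';   // the external site you have permission to hotlink
    $url = isset($_GET['src']) ? $_GET['src'] : '';

    if (parse_url($url, PHP_URL_HOST) !== $allowedHost) {
        header('HTTP/1.1 403 Forbidden');
        exit;
    }

    $cacheDir  = __DIR__ . '/cache';
    $cacheFile = $cacheDir . '/' . md5($url);

    if (!is_file($cacheFile)) {
        if (!is_dir($cacheDir)) {
            mkdir($cacheDir, 0755, true);
        }
        file_put_contents($cacheFile, file_get_contents($url));   // first hit: download and store
    }

    // Guess the content type from the cached file and stream it out.
    $finfo = new finfo(FILEINFO_MIME_TYPE);
    header('Content-Type: ' . $finfo->file($cacheFile));
    readfile($cacheFile);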
You might consider using a proxying CDN like CoralCDN.