I created an API for data on a website.
This API creates a PHP file, which is put in a cache directory.
(The data are stored in an array and passed through json_decode.)
It works as long as my file is smaller than 5-6 MB; above this size I get a white page.
Several possibilities come to mind:
PHP memory limit: I tried to insert ini_set('memory_limit', '1024M'); but it changed nothing
a JSON size limit
a server-side size limit
I know my question is very open-ended, but any help will be appreciated.
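For context, a white page usually means a fatal error that isn't being displayed; a minimal debugging sketch (the variable names here are hypothetical, not from the asker's code) for surfacing the error and checking json_decode explicitly might look like:

// Debugging sketch only: show errors while investigating, never in production.
ini_set('display_errors', '1');
error_reporting(E_ALL);

// Raise the memory limit for this request (value is an assumption).
ini_set('memory_limit', '1024M');

// $json holds the cached payload (hypothetical variable).
$data = json_decode($json, true);
if ($data === null && json_last_error() !== JSON_ERROR_NONE) {
    die('json_decode failed: ' . json_last_error_msg());
}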
I'm using XAMPP for my project. I'm trying to upload really big images and I've noticed that it doesn't work with all images.
After trying it out a few times I came to the conclusion that images with a width higher than roughly 6500px do not upload.
I've also found out that the file size doesn't seem to matter, since a 1.4MB image with a resolution of more than 6500px won't upload, while another at 4.8MB but with a smaller resolution uploads without any problem.
Somehow the reason the image is not being uploaded lies with the resolution and not with the file size.
The only code I have to show is the upload itself. However, there's nothing special about it. As mentioned, other images upload perfectly fine; only the ones with too high a resolution don't.
PHP code:
move_uploaded_file($imageUploadFile, $taget_original);
php.ini
post_max_size=10000M
upload_max_filesize=10000M
Is there any solution to this problem? Do I need to specify somewhere that I want to upload high resolution images?
This is really important since I want to be able to upload 8k to 16k images. At the moment this doesn't work: even though the file size should be small enough, the image won't upload for some reason.
I wouldn't be looking in the upload-size department but in the (allowed) memory-size department (e.g. memory_limit). I bet you're using ImageMagick or something similar to actually do something with the image.
Also see here and here. Just make sure you read the documentation, because the values are supposed to be specified in bytes, not megabytes (also see the comments on those answers).
I would try something like:
$limit = 2 * (1024 * 1024 * 1024); // 2 GB in bytes
// set PHP's memory limit
ini_set('memory_limit', $limit); // for testing purposes you could try -1 (unlimited) instead of $limit
// maximum amount of memory the pixel cache may use
Imagick::setResourceLimit(Imagick::RESOURCETYPE_MEMORY, $limit);
// maximum amount of memory map to allocate for the pixel cache
Imagick::setResourceLimit(Imagick::RESOURCETYPE_MAP, $limit);
What the actual limit should be will, I guess, have to be found by trial and error, and will also depend on the amount of memory available, of course. If you're on shared hosting then this might (or: most likely will) be a problem.
I had a similar case in the past. Quite a strange solution, but it worked for me.
Try to specify the size with MB, not M
upload_max_filesize = 256MB
post_max_size = 256MB
It should work. If not, try to increase memory_limit as well.
I hope this helps.
Some Updates:
I've looked into my JavaScript code a little and found a few interesting non-working implementations.
It seems that this was all a client-side problem, or at least I think it is. For some reason my onprogress function doesn't work correctly. I've tried to upload images with a bigger delay, and sometimes this worked out; other times it didn't, though.
I'm not really sure whether the client-side problem is causing all of this. I'll probably just have to fix the front-end issue and hope the back-end issue resolves itself.
Either way, I'm going to update this question as soon as I've tried to fix everything.
There are several places where this can fail:
the size of the POST allowed by the web server (it is base64 encoded and hence larger than the file size)
the time limit the web server allows for a client to make a request
the max upload size allowed by PHP
the memory available to PHP to load and process the image (assuming you do anything other than move_uploaded_file())
Except for the last of these, none of it has anything to do with the dimensions of the image.
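If it helps to narrow down which of those is biting, here is a small sketch that prints the PHP-side limits involved (these are standard ini directives, nothing specific to the asker's setup):

// Print the PHP directives that commonly break large uploads.
$directives = array(
    'post_max_size',       // max size of the whole POST body
    'upload_max_filesize', // max size of a single uploaded file
    'memory_limit',        // memory available for processing the image
    'max_execution_time',  // seconds a request may run
    'max_input_time',      // seconds PHP will spend parsing the request
);
foreach ($directives as $d) {
    echo $d . ' = ' . ini_get($d) . PHP_EOL;
}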
I'm working on creating a photo upload form and I'm running into some trouble. Essentially, a user fills out some basic demographic data, checks a media release, selects a photo, and uploads it. I then use a few nested if statements to validate that it's the correct photo size, type, etc.
At times it works just fine, but with certain photos I've been getting this error:
PHP Warning: POST Content-Length of 11310075 bytes exceeds the limit
of 8388608 bytes
Followed by a bunch of
PHP Notice: Undefined index
for each of the elements in my $_POST array. I did some digging with phpinfo() and found that memory_limit is set to 128M... so I'm confused as to what's going wrong.
I'm using MODX, Apache/2.2.25
Thanks for your help!
The issue is not about memory, but about the max upload/POST data limit. Please check your phpinfo() for:
post_max_size
upload_max_filesize
These values should be increased. That is usually done by editing the php.ini file; note that post_max_size and upload_max_filesize cannot be changed at runtime with ini_set(), because they are per-directory (PHP_INI_PERDIR) settings.
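So, for example, in php.ini or a .user.ini file you would raise them like this (the values below are just example figures, adjust to your needs):

post_max_size = 64M
upload_max_filesize = 32M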
You can set unlimited memory usage with this code.
ini_set('memory_limit', '-1');
During the resize of an image I receive PHP's "Out of memory" error. I can solve it by setting a greater value in my php.ini, but what happens if I can't modify my php.ini (and also can't set this value at runtime with PHP) due to the web host's security policy?
I upload the image using a normal POST with $_FILES; my memory_limit is 32 MB. How can I calculate whether an image will cause this error during the resize? The uploaded photos may have different formats and weights; I'm trying to resize them all to a uniform width of 820px.
EDIT
I have found this site http://www.dotsamazing.com/en/labs/phpmemorylimit
Maybe I have to replicate this calculation, but it seems very hard to do.
It looks like you would want to get the image size before you attempt to resize it.
For example, once you have a temporary path for the file on the server, pass it to the function:
$image_size = getimagesize('/path/to/image');
This will provide you with lots of info about the image, including its dimensions.
Once you have this, subtract it from your memory_limit, which you can get the value of with:
$my_memory_limit = ini_get('memory_limit');
So your available memory becomes:
$memory_available = $my_memory_limit - $image_size;
At which point you can make the decision of whether you have the memory to resize it (depending on how memory-costly your resize process is), or whether you kick it back to the user and tell them to use a smaller image.
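As a side note, getimagesize() returns dimensions rather than a byte count, and ini_get('memory_limit') returns a shorthand string such as "32M", so the subtraction above needs a little more work. A rough sketch of the usual estimate (width × height × channels × bytes per channel, times a fudge factor for GD's overhead; the 1.8 factor is an assumption, not an exact figure):

// Convert a php.ini shorthand value like "32M" into bytes.
function shorthand_to_bytes($value) {
    $value = trim($value);
    $unit  = strtolower(substr($value, -1));
    $bytes = (int) $value;
    switch ($unit) {
        case 'g': $bytes *= 1024; // fall through
        case 'm': $bytes *= 1024; // fall through
        case 'k': $bytes *= 1024;
    }
    return $bytes;
}

// Rough estimate of the memory needed to open an image with GD.
function estimated_image_memory($path) {
    $info = getimagesize($path);
    if ($info === false) {
        return false;
    }
    $width    = $info[0];
    $height   = $info[1];
    $channels = isset($info['channels']) ? $info['channels'] : 4;
    $bits     = isset($info['bits']) ? $info['bits'] : 8;

    // width * height * bytes-per-pixel, times a fudge factor (assumption).
    return (int) ceil($width * $height * $channels * ($bits / 8) * 1.8);
}

$limit  = shorthand_to_bytes(ini_get('memory_limit'));
$needed = estimated_image_memory('/path/to/image');

if ($needed !== false && $needed < $limit - memory_get_usage()) {
    // probably safe to resize
} else {
    // ask the user for a smaller image
}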
We're developing a website where users may change their slider images (a fullscreen slider). Each image is around 2000px by 2000px, and we allow users to upload as many images as they want in our HTML form (capped at 10).
Their upload speed will be pretty slow and I believe we'll easily exceed PHP's max_execution_time, which defaults to 30 seconds. We will also let users upload some .rar/.zip files in the future, capped at 100MB.
We had a few ideas, but I wanted to ask SO for better solutions/reviews.
We could raise the 30 seconds to a much higher value, since we have access to php.ini, and let users upload all images at once, but that may create performance-related issues in the long term. (This is not an option!)
We could make use of JavaScript on the client side. For each image path specified in the HTML form, JavaScript could POST it with XMLHttpRequest one by one and wait for a response. If the response is true, JavaScript moves on to the next image and attempts to upload it. (Benefit: each image starts its own PHP request and gets its own 30-second lifetime.)
The JavaScript solution won't work for file uploads when the file is above 50MB. Customers usually cap out at 25kbps upload speed in the target region, so there is no way they can upload 50MB in 30 seconds. Similar to #2, we may use a script where the uploaded file is saved in chunks of bytes every 30 seconds and the client continues to push the remaining bytes, or something similar.
Basically, how would you complete this task?
We don't want to rely on php.ini, so increasing max_execution_time shouldn't be an option.
Should we go with #2 for image uploads, and what can you suggest for #3?
Take a look into chunked uploads.
You'll need to use some sort of uploader script like JUpload or plUpload. You can specify how large each chunk of a file sent to the server should be. For example, if you have a 10MB file, you can chunk it into 1MB pieces, so 10 1MB chunks would be uploaded to the server. In the case of a slow connection, just make the chunks smaller, e.g. 500KB.
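For what it's worth, a minimal server-side sketch of receiving such chunks and appending them to the target file (the chunk/chunks/name field names are an assumption borrowed from plUpload's sample handler; adapt them to whatever your uploader actually sends):

// Minimal chunked-upload receiver sketch (field names are assumptions).
$chunk  = isset($_REQUEST['chunk'])  ? (int) $_REQUEST['chunk']  : 0;
$chunks = isset($_REQUEST['chunks']) ? (int) $_REQUEST['chunks'] : 0;
$name   = isset($_REQUEST['name'])   ? basename($_REQUEST['name']) : 'upload.bin';

$target = __DIR__ . '/uploads/' . $name;

// Append this chunk to the partial file ("wb" for the first chunk, "ab" after).
$out = fopen($target . '.part', $chunk === 0 ? 'wb' : 'ab');
$in  = fopen($_FILES['file']['tmp_name'], 'rb');

while ($buffer = fread($in, 4096)) {
    fwrite($out, $buffer);
}

fclose($in);
fclose($out);

// When the last chunk arrives, rename the partial file to its final name.
if (!$chunks || $chunk === $chunks - 1) {
    rename($target . '.part', $target);
}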
I've tried to get over a million rows from a MySQL table into my PHP application.
The steps to get the data are as below.
Send the query to MySQL with mysql_query() or mysqli_query().
Put the result into an array by looping over each row with mysql_fetch_array(), or build the array with mysqli_fetch_all().
I got the messages 'Out of memory' or 'Allowed memory size of * bytes exhausted'.
I might be able to solve them if I change memory_limit.
But I want to know why those messages were shown.
I've changed memory_limit from 128M to 512M to 1024M.
When I set 128M, sending the query failed with 'Allowed memory size of * bytes exhausted' and I couldn't add the result to the array.
Then, setting 512M, the query finished successfully but I couldn't add the result to the array ('Allowed memory size of * bytes exhausted').
Finally, setting 1024M, both sending the query and adding the result to the array finished, but sometimes mysqli_fetch_all() failed with 'Out of memory'.
One thing I can't understand is why 'Out of memory' doesn't happen consistently.
One more thing I want to know is how to get over a million rows from a table into PHP without constantly tuning memory. After getting all the rows, I want to let my users access them through browser operations, including downloading some types of files, such as CSV.
You will get "allowed memory size of * bytes exhausted’" when the amount of memory needed exceeds the memory limit specified in php.ini file.
You will get "Out of memory" when the server itself (or Apache or MySQL) runs out of memory.
The best option to retrieve huge number of rows is to use limit and offset in your query and loop through them.
This will prevent the memory issues that you are facing.
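A rough sketch of that approach, streaming the rows out as CSV instead of building one huge array (the credentials, table name, and id column are placeholders, not from the asker's schema):

// Paged export sketch: fetch the table in slices of $batchSize rows
// instead of loading a million rows into one array.
$mysqli    = new mysqli('localhost', 'user', 'password', 'database'); // placeholder credentials
$batchSize = 10000;
$offset    = 0;

$out = fopen('php://output', 'w'); // stream CSV straight to the browser

do {
    $result = $mysqli->query(
        "SELECT * FROM big_table ORDER BY id LIMIT $batchSize OFFSET $offset"
    );

    $rowCount = $result->num_rows;
    while ($row = $result->fetch_assoc()) {
        fputcsv($out, $row);
    }
    $result->free(); // release this batch before fetching the next one

    $offset += $batchSize;
} while ($rowCount === $batchSize);

fclose($out);

(For very deep offsets this gets slow; seeking by the last seen id is usually faster, but the sketch keeps to the LIMIT/OFFSET approach described above.)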