I am looking to flock() an image.
Currently I am using the following:
$img = ImageCreateFromPng($img_path);
flock($img,LOCK_EX);
It seems that the GD library's file handle is not valid with flock. How can I access the image and flock the file?
The function flock only works on file handles (or stream wrappers if they support locking). So, if you want to lock an image when you read it, you'd need to open it twice:
$f = fopen($imgPath, 'r');
if (!$f) {
    // Handle error (file does not exist perhaps, or no permissions?)
}
if (flock($f, LOCK_EX)) {
    $img = imagecreatefrompng($imgPath);
    // ... Do your stuff here
    flock($f, LOCK_UN);
}
fclose($f);
$img in your example is not a file handle; it is a handle to a GD image resource in memory.
You can use imagecreatefromstring to load an image like this:
$file = fopen($fileName, "r+b");
flock($file, LOCK_EX);
$imageBinary = stream_get_contents($file);
$img = imagecreatefromstring($imageBinary);
unset($imageBinary); // we don't need this anymore - it saves a lot of memory
If you want to save a modified version of the image to the open stream you have to use output buffering:
ob_start();
imagepng($img);
$imageBinary = ob_get_clean();
ftruncate($file, 0);
fseek($file, 0);
fwrite($file, $imageBinary);
unset($imageBinary);
flock($file, LOCK_UN);
fclose($file);
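The write-back steps above (capture output with buffering, truncate, rewind, rewrite, then release the lock) can be sketched without the GD extension, using echo as a stand-in for imagepng(); the temp-file path is just for the sake of a self-contained example:

```php
// Sketch of the truncate-rewind-rewrite pattern, with echo standing
// in for imagepng() so the example doesn't require the GD extension.
$path = tempnam(sys_get_temp_dir(), 'img');
$file = fopen($path, 'r+b');
flock($file, LOCK_EX);
fwrite($file, 'original contents');

// Capture "image" output into a string via output buffering.
ob_start();
echo 'modified contents';        // imagepng($img) would go here
$imageBinary = ob_get_clean();

// Replace the file contents while still holding the lock.
ftruncate($file, 0);
fseek($file, 0);
fwrite($file, $imageBinary);
flock($file, LOCK_UN);
fclose($file);

$result = file_get_contents($path);
unlink($path);
```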
flock only works with file pointers and ImageCreateFromPng only works with filenames. Try making two different calls:
$fp = fopen($img_path, 'r');
flock($fp, LOCK_EX);
$img = ImageCreateFromPng($img_path);
flock is cooperative, so it only works if everybody uses it. As long as ImageCreateFromPng doesn't use flock, the code above should work.
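A minimal sketch of what "cooperative" means in practice: a writer takes an exclusive lock while replacing the file, and a reader takes a shared lock while reading, both on the same path. The path and placeholder contents below are illustrative only:

```php
// Sketch: cooperative locking around a file read/write. Both sides
// must call flock() on the same file for the lock to mean anything.
$imgPath = tempnam(sys_get_temp_dir(), 'img');

// Writer side: exclusive lock while replacing the file contents.
$w = fopen($imgPath, 'c');          // 'c' opens for writing without truncating
if ($w && flock($w, LOCK_EX)) {
    ftruncate($w, 0);
    fwrite($w, 'new image bytes');  // placeholder for imagepng() output
    fflush($w);
    flock($w, LOCK_UN);
}
fclose($w);

// Reader side: shared lock, so concurrent readers don't block each other.
$r = fopen($imgPath, 'rb');
$contents = '';
if ($r && flock($r, LOCK_SH)) {
    $contents = stream_get_contents($r);
    flock($r, LOCK_UN);
}
fclose($r);
unlink($imgPath);
```

A process that opens the file without calling flock() will bypass the lock entirely, which is exactly the caveat about ImageCreateFromPng above.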
Related
I'm downloading a large file like this:
$fd = fopen($url, "r");
while (!feof($fd)) {
    echo fread($fd, 4096);
    ob_flush();
    flush();
}
But I have one problem - the download stops at only 11.6 MB...
Where is the problem? I'm using ob_flush() and flush(), so I think it should work.
Thanks.
You don't need the fread() loop if you just want to output a remote file. You can use:
readfile($url);
That's it. However, the script you showed should work as well. The problem must be on the remote server.
If the download takes long, you should consider setting the execution time to unlimited:
set_time_limit(0);
... at the top of your script.
Please advise me on the most efficient way to save large files from PHP's stdin.
An iOS developer sends large video content to my server, and I have to store it in files.
I read the stdin stream with the video data and write it to a file. For example, like this:
$handle = fopen("php://input", "rb");
while (!feof($handle)) {
    $http_raw_post_data .= fread($handle, 8192);
}
Which function is better to use: file_get_contents, fread, or something else?
I agree with @hek2mgl that treating this as a multipart form upload would make the most sense, but if you can't alter your input interface, you can use file_put_contents() on a stream instead of looping through the file yourself.
$handle = fopen("php://input", "rb");
if (false === file_put_contents("outputfile.dat", $handle)) {
    // handle error
}
fclose($handle);
It's cleaner than iterating through the file, and it might be faster (I haven't tested).
Don't use file_get_contents, because it would attempt to load the entire content of the file into a string.
From the PHP docs:
file_get_contents() is the preferred way to read the contents of a file into a string. It will use memory mapping techniques if supported by your OS to enhance performance.
I'm sure you just want to create the movie file on your server; this is a more efficient way:
$in = fopen("php://input", "rb");
$out = fopen('largefile.dat', 'wb');
while (!feof($in)) {
    fwrite($out, fread($in, 8192));
}
fclose($out);
fclose($in);
If you use nginx as your web server, I recommend the nginx upload module, which supports resumable uploads.
I wrote a PHP script that limits the speed and number of connections for file downloads. I used fopen() and fseek(), something like this:
$f = fopen($file, 'rb');
if ($f) {
    fseek($f, $start); // $start extracted from $_SERVER['HTTP_RANGE']
    while (!feof($f)) {
        echo fread($f, $speed); // $speed is bytes per second
        ob_flush();
        flush();
        sleep(1);
    }
    fclose($f);
}
The download process may take several hours to complete. Will the whole file be in memory until the end of the download, and how can I optimize this?
No, fread() uses an internal buffer to stream the data (8 KB by default), so only a very small part of the file actually resides in memory.
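You can verify this claim directly by watching memory_get_usage() while reading a file larger than the read buffer; the file size and chunk size below are arbitrary choices for the sake of a self-contained sketch:

```php
// Sketch: show that fread() streams in chunks rather than loading the
// whole file. Write a 4 MB temp file, then read it 8 KB at a time
// while watching memory usage.
$path = tempnam(sys_get_temp_dir(), 'big');
file_put_contents($path, str_repeat('x', 4 * 1024 * 1024));

$before = memory_get_usage();
$f = fopen($path, 'rb');
$total = 0;
while (!feof($f)) {
    $chunk = fread($f, 8192);   // only this 8 KB chunk is in memory
    $total += strlen($chunk);
}
fclose($f);
$growth = memory_get_usage() - $before;
unlink($path);
// $growth stays far below the 4 MB file size
```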
I have a page with an HTML5 drag-and-drop upload feature, and the file is uploaded using the PUT method. If I upload large image files, only part of the image gets saved on the server. I'm using the following PHP code to save the file:
$putdata = fopen("php://input", "r");
$fp = fopen("/tmp/myputfile" . microtime() . ".jpg", "w");
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
fclose($fp);
fclose($putdata);
Is anything wrong with this? Please help.
I think it's because the entire file has not been completely uploaded yet when you try to read it, so fread() will sometimes return zero bytes even though there is still data being uploaded.
Maybe you can try using the feof() function to check whether there is any more data to be read?
See http://www.php.net/manual/en/function.feof.php
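A minimal sketch of the loop rewritten to stop on feof() rather than on the first zero-byte read; a temp file stands in for php://input here so the example is self-contained, and the file names are placeholders:

```php
// Sketch: copy a PUT body to disk, stopping on end-of-stream rather
// than on the first zero-byte read, which is what the while($data = ...)
// form does. A temp file stands in for php://input.
$src = tempnam(sys_get_temp_dir(), 'in');
file_put_contents($src, str_repeat('a', 100000));
$dst = tempnam(sys_get_temp_dir(), 'out');

$putdata = fopen($src, 'rb');
$fp = fopen($dst, 'wb');
while (!feof($putdata)) {
    $data = fread($putdata, 1024);
    if ($data === false) {
        break;              // a real read error, not just "no data yet"
    }
    fwrite($fp, $data);
}
fclose($fp);
fclose($putdata);

$ok = (filesize($dst) === filesize($src));
unlink($src);
unlink($dst);
```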
If you are on Windows, you should add "b" to the mode parameter of fopen(); see the manual. By the way, it's a good idea to add the flag regardless, for code portability.
I'm using CodeIgniter and I can't figure out how to unzip files!
PHP itself has a number of functions for dealing with gzip files.
If you want to create a new, uncompressed file, it would be something like this.
Note: This doesn't check if the target file exists first, doesn't delete the input file, or do any error checking. You really should fix those before using this in production code.
// This input should be from somewhere else, hard-coded in this example
$file_name = 'file.txt.gz';
// Raising this value may increase performance
$buffer_size = 4096; // read 4kb at a time
$out_file_name = str_replace('.gz', '', $file_name);
// Open our files (in binary mode)
$file = gzopen($file_name, 'rb');
$out_file = fopen($out_file_name, 'wb');
// Keep repeating until the end of the input file
while (!gzeof($file)) {
    // Read buffer-size bytes
    // Both fwrite and gzread are binary-safe
    fwrite($out_file, gzread($file, $buffer_size));
}
// Files are done, close files
fclose($out_file);
gzclose($file);
Note: This deals with gzip only. It doesn't deal with tar.
gzopen is way too much work. This is more intuitive:
$zipped = file_get_contents("foo.gz");
$unzipped = gzdecode($zipped);
This also works on HTTP pages where the server is sending out gzipped data.
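A quick round trip shows the idea, using gzencode() to create the compressed data (in a real case it would come from a .gz file or an HTTP response) and gzdecode() to restore it; both functions are part of the zlib extension:

```php
// Round trip: compress with gzencode(), restore with gzdecode().
$original = str_repeat('compress me ', 100);
$zipped   = gzencode($original);
$unzipped = gzdecode($zipped);
$smaller  = strlen($zipped) < strlen($original);  // repetitive data compresses well
```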
If you have access to system():
system("gunzip file.sql.gz");
Use the functions implemented by the Zlib Compression extension.
This snippet shows how to use some of the functions made available from the extension:
// open file for reading
$zp = gzopen($filename, "r");
// read 3 characters
echo gzread($zp, 3);
// output until the end of the file, then close it
gzpassthru($zp);
gzclose($zp);
Download the Unzip library, then include or autoload it:
$this->load->library('unzip');
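Note that all of the gzip functions above handle .gz streams, not .zip archives. If the files in question are actual .zip archives, PHP's built-in ZipArchive class (from the zip extension) can extract them without any CodeIgniter library; a minimal sketch, with temp-dir paths used only to keep the example self-contained:

```php
// Sketch: create a small zip in a temp dir, then extract it with
// ZipArchive. In CodeIgniter you would call this from a controller
// just like any other plain PHP.
$dir     = sys_get_temp_dir();
$zipPath = $dir . '/example.zip';

$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
$zip->addFromString('hello.txt', 'hello world');
$zip->close();

$zip = new ZipArchive();
if ($zip->open($zipPath) === true) {
    $zip->extractTo($dir . '/extracted');
    $zip->close();
}
$text = file_get_contents($dir . '/extracted/hello.txt');
```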