I am using url2png to get screenshots of pages. However, instead of requesting the image every time, I want to save it to an external server via FTP.
My first approach was to use:
$image = fopen($src,"r");
And then an ftp_fput. But since url2png may take about 5 seconds to produce the screenshot, ftp_fput uploads an empty file.
Do I need to save the file locally first, or is there a workaround?
Thanks!
Found a solution using this question: Using file_get_contents and ftp_put
$fp = fopen('php://temp', 'r+');
fputs($fp, file_get_contents($src));
rewind($fp);
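For completeness, the remaining step would look something like this (just a sketch; $conn stands for an FTP connection opened elsewhere with ftp_connect/ftp_login, and the remote path is only an example):
// $conn: an existing FTP connection; upload the in-memory stream in binary mode
ftp_fput($conn, 'screenshots/page.png', $fp, FTP_BINARY);
fclose($fp);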
How do I generate a picture of a graph of voting results, and how do I save the image on my server so that I can keep using it?
Generating graphs, etc.
You can use PHPExcel,
which makes it possible to save all generated files,
e.g.: $objWriter->save('name.xls'); see the documentation for details.
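For illustration, a rough sketch of that save flow (assumes PHPExcel is installed; the cell values and file name are placeholders):
require_once 'PHPExcel.php';
$objPHPExcel = new PHPExcel();
$objPHPExcel->getActiveSheet()->setCellValue('A1', 'Votes');
$objPHPExcel->getActiveSheet()->setCellValue('A2', 42);
// Create a writer and save the file on the server
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel5');
$objWriter->save('name.xls');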
Saving without using PHPExcel, etc.
If you have a graph or something else that you want to save on the server side,
you can use:
$fp = fopen($name, "w");
fwrite($fp, $content); // this line saves
fclose($fp);
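As for generating the graph picture itself, one possible sketch uses PHP's GD extension (the vote counts and the file name here are made up):
// Hypothetical example: render a tiny bar chart of vote counts and save it as a PNG
$votes = array('A' => 10, 'B' => 25, 'C' => 7);
$img   = imagecreatetruecolor(300, 150);
$white = imagecolorallocate($img, 255, 255, 255);
$blue  = imagecolorallocate($img, 0, 0, 200);
$black = imagecolorallocate($img, 0, 0, 0);
imagefill($img, 0, 0, $white);
$x = 20;
foreach ($votes as $label => $count) {
    // Bar height scaled by the vote count; baseline at y = 130
    imagefilledrectangle($img, $x, 130 - $count * 4, $x + 40, 130, $blue);
    imagestring($img, 3, $x + 15, 133, $label, $black);
    $x += 80;
}
imagepng($img, 'graph.png'); // saves the generated picture on the server
imagedestroy($img);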
I'm trying to figure out how to pull an image from a third-party website that is changed every so often. Basically I am using vBulletin software and would like to avoid a mixed content warning caused by hosting an image (HTTP) from another site on mine (HTTPS). I would like a PHP function that fetches the image and saves it to a folder on my server, and then another PHP function that serves the saved image from my server and displays it. Thoughts? Thanks. I keep getting a "failed to open stream" error from file_get_contents...
ob_start();
//Get the file
$content = file_get_contents("http://www.defconwarningsystem.com/current/defcon.jpg");
//Store in the filesystem.
$fp = fopen("/images/defcon/defcon.jpg", "w");
fwrite($fp, $content);
fclose($fp);
$html = ob_get_clean();
return $html;
Change
$content = file_get_contents("http://www.defconwarningsystem.com/current/defcon.jpg");
To
$content = file_get_contents("//defconwarningsystem.com/current/defcon.jpg");
And see if that works.
If file_get_contents doesn't work, try cURL.
See the example usage in the PHP reference.
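A rough cURL equivalent (using the URL and target path from the question; adjust as needed, and note that the target directory must exist and be writable):
$ch = curl_init('http://www.defconwarningsystem.com/current/defcon.jpg');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$content = curl_exec($ch);
curl_close($ch);
if ($content !== false) {
    // Write next to the script rather than at the filesystem root
    file_put_contents(__DIR__ . '/images/defcon/defcon.jpg', $content);
}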
There is a file located on a server. Let's call it "Movie".
My site's users need to download this movie, but the movie can only be downloaded by my website's IP.
Is there a way to download Movie to my server and from there to the user?
I could do that with file_put_contents, but I need the client to download the file WHILE my server is downloading it.
Thanks!
You can make use of the standard wrappers PHP offers and combine that with stream_copy_to_stream:
Example / Demo:
<?php
$url = 'http://example.com/some.url';
$src = fopen($url, 'r');
$dest = fopen('php://output', 'w');
$bytesCopied = stream_copy_to_stream($src, $dest);
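If the browser should treat the proxied stream as a file download, the appropriate headers would have to be sent before the copy, for example (the file name is only illustrative):
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="Movie.mp4"');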
Related (couldn't find a better duplicate so far):
Remotely download a file from an external link to my server - download stops prematurely
Take a look at the readfile function.
http://php.net/manual/en/function.readfile.php
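A minimal sketch of that approach (assumes allow_url_fopen is enabled; the URL and file name are placeholders):
$url = 'http://example.com/Movie.mp4';
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="Movie.mp4"');
readfile($url); // streams the remote file straight to the client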
Does anyone know a good PHP solution to delete, or better, wipe a file from a Linux system?
Scenario:
The file is encrypted and saved; when a download is requested, the file is copied to a temporary folder and decrypted. This is already working.
But how do I remove the file from the temporary location after sending it to the user?
In my mind I have the following options:
Open the file via fopen and write 0s and 1s into it (I think this is very slow)
Save the file to Memcache instead of the hard disk (could be a problem with my hoster)
Use some 3rd-party tool on the command line or as a cron job (could be a problem to install)
Goal: delete the file from the hard disk without any possibility of recovery (wipe/overwrite).
Call "shred" via exec/system/passthru
Arguably the best approach is to never save the file in its decrypted state in the first place.
Rather, use stream filters to decrypt it on-the-fly and send it directly to the end-user.
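For illustration only, assuming the file was encrypted with mcrypt's Rijndael-128 (the key, IV and path below are placeholders), PHP's mdecrypt.* stream filters can decrypt while reading, so nothing decrypted ever touches the disk:
$key = 'sixteen byte key'; // placeholder 16-byte key
$iv  = 'sixteen byte iv.'; // placeholder 16-byte IV
$fp = fopen('/path/to/encrypted/file', 'rb');
stream_filter_append($fp, 'mdecrypt.rijndael-128', STREAM_FILTER_READ, array('iv' => $iv, 'key' => $key));
header('Content-Type: application/octet-stream');
fpassthru($fp); // decrypted bytes go straight to the client
fclose($fp);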
Update
Your option 1 is actually not too bad if you consider this code:
$filename = '/path/to/file';
$size = filesize($filename);
$src = fopen('/dev/zero', 'rb');
$dest = fopen($filename, 'wb');
stream_copy_to_stream($src, $dest, $size);
fclose($src);
fclose($dest);
You could choose /dev/urandom as well, but that will be slow.
I have a page with an HTML5 drag-and-drop upload feature, and the file is uploaded using the PUT method. If I upload large image files, only part of the image gets saved on the server. I'm using the following PHP code to save the file:
$putdata = fopen("php://input", "r");
$fp = fopen("/tmp/myputfile" . microtime() . ".jpg", "w");
while ($data = fread($putdata, 1024))
fwrite($fp, $data);
fclose($fp);
fclose($putdata);
Is there anything wrong with this? Please help.
I think it is because the entire file has not been completely uploaded yet when you try to read it, so the read sometimes returns zero bytes even though there is still data being uploaded.
Maybe you can try using the feof function to check whether there is any more data to be read?
See http://www.php.net/manual/en/function.feof.php
If you are on Windows, you should add "b" to the mode parameter of fopen(); see the manual. BTW, it is a good idea to add that flag for code portability anyway...
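Putting both suggestions together, a hedged rewrite of the handler (binary-safe modes, with stream_copy_to_stream doing the read loop) could look like this:
// Sketch of the PUT handler; the target path mirrors the one from the question
$putdata = fopen('php://input', 'rb');
$fp      = fopen('/tmp/myputfile' . microtime(true) . '.jpg', 'wb');
stream_copy_to_stream($putdata, $fp); // copies until the end of the request body
fclose($fp);
fclose($putdata);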