<form action="php.php" method="post" enctype="multipart/form-data">
Send these files:<br />
<input name="img[]" type="file" multiple="multiple" /><br />
<input type="submit" value="Send files" />
</form>
<?php
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);
foreach ($_FILES['img']['tmp_name'] as $index => $tmpName) {
    if (!empty($tmpName) && is_uploaded_file($tmpName)) {
        $handle = fopen($tmpName, "r");
        $data = fread($handle, filesize($tmpName));
        $client_id = "d5f419ef9aedf16";
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, 'https://api.imgur.com/3/image.json');
        curl_setopt($ch, CURLOPT_POST, TRUE);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
        curl_setopt($ch, CURLOPT_HTTPHEADER, array('Authorization: Client-ID ' . $client_id));
        curl_setopt($ch, CURLOPT_POSTFIELDS, array('image' => base64_encode($data)));
        curl_setopt($ch, CURLOPT_VERBOSE, true);
        curl_setopt($ch, CURLOPT_STDERR, fopen('php://stderr', 'w'));
        $reply = curl_exec($ch);
        $info = curl_getinfo($ch);
        var_dump($info);
        if (curl_errno($ch))
            echo 'Curl error: ' . curl_error($ch);
        curl_close($ch);
        $reply = json_decode($reply);
        printf('<img height="180" src="%s" >', $reply->data->link);
    }
}
?>
I made this page and it works perfectly on localhost, but when I run it from my server it does not work. The problem is that I get no output at all, not even the curl_getinfo() dump or any errors. I don't know how to debug this since I can't get any information out of it.
Thank you hans for the suggestions; you were right, using base64 is not mandatory in the V3 API (it was in previous versions, but not anymore). Now it's much faster. This is my final code:
if (!empty($_FILES['img']['tmp_name'])) {
    foreach ($_FILES['img']['tmp_name'] as $index => $tmpName) {
        if (!empty($tmpName) && is_uploaded_file($tmpName)) {
            if ($handle = fopen($tmpName, "rb")) {
                $data = stream_get_contents($handle, filesize($tmpName));
                $client_id = "d5f619ef9aedf16";
                $ch = curl_init();
                curl_setopt($ch, CURLOPT_URL, 'https://api.imgur.com/3/image.json');
                curl_setopt($ch, CURLOPT_POST, TRUE);
                curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
                curl_setopt($ch, CURLOPT_HTTPHEADER, array('Authorization: Client-ID ' . $client_id));
                curl_setopt($ch, CURLOPT_POSTFIELDS, array('image' => $data));
                curl_setopt($ch, CURLOPT_VERBOSE, true);
                $reply = curl_exec($ch);
                if (curl_errno($ch))
                    echo 'Curl error: ' . curl_error($ch);
                curl_close($ch);
                $reply = json_decode($reply);
                $screens .= $reply->data->link . " ";
                fclose($handle);
            }
        }
    }
}
The reason the images were not uploading was upload_max_filesize and post_max_size; I increased them and had no further problems.
You do several things wrong.
This code:
foreach ($_FILES['img']['tmp_name'] as $index => $tmpName) { will be executed even on GET requests, which contain no files. In that case $_FILES['img'] does not exist, and you get an "Undefined index: img" error. Furthermore, on GET requests this undefined index hands NULL to foreach, and when you tell foreach() to loop over NULL, it gives you the error "Invalid argument supplied for foreach()". The fact that you haven't already caught this proves that you're not checking the PHP error logs, which is the first thing you should do when debugging a malfunctioning PHP script. To fix both problems, wrap the foreach in if (!empty($_FILES['img']['tmp_name'])) { }. Then find your PHP error log and read the error reports: usually its location is listed in phpinfo() under error_log. If that setting is an empty string, the messages go to your web server's error log; for example, with nginx and an empty PHP error_log setting, they end up in nginx's error log (like /var/log/nginx/error.log).
Next issue: $handle = fopen($tmpName, "r"); will corrupt binary data (like images) on some systems (famously, Windows). Change the mode to "rb" to safely read binary data (and image formats are binary).
Next issue: you believe that fread() reads as many bytes as you request it to read. That is incorrect. fread() will read UP TO as many bytes as requested, but can stop for many reasons before it has read the requested number of bytes. You either have to call fread() in a loop until all the bytes are read, or better yet, use stream_get_contents(), which essentially does that fread() loop for you.
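To illustrate the difference, here is a minimal sketch comparing a manual fread() loop with stream_get_contents(); the temporary file and its size are made up purely for the demonstration:

```php
<?php
// fread() may return fewer bytes than requested, so loop until EOF;
// stream_get_contents() performs this loop internally.
$tmp = tempnam(sys_get_temp_dir(), 'demo');
file_put_contents($tmp, str_repeat('x', 100000));

// Manual fread() loop: keep reading until feof().
$handle = fopen($tmp, 'rb');          // "rb": binary-safe on all platforms
$data = '';
while (!feof($handle)) {
    $chunk = fread($handle, 8192);    // may return fewer than 8192 bytes
    if ($chunk === false) {
        break;
    }
    $data .= $chunk;
}
fclose($handle);

// Equivalent one-liner: stream_get_contents() loops for you.
$handle = fopen($tmp, 'rb');
$data2 = stream_get_contents($handle);
fclose($handle);

var_dump(strlen($data) === 100000, $data === $data2);
unlink($tmp);
```

Both approaches return the complete file contents; the loop version just makes the retry behavior explicit.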
Next issue: the line curl_setopt($ch, CURLOPT_STDERR, fopen('php://stderr', 'w')); is completely unnecessary; PHP's STDERR is libcurl's default STDERR anyway.
Next issue: you don't fclose() $handle. This is OK for short-running scripts (because the OS closes it for you when the script finishes), but the bigger your code grows, the longer the resource leak lives. It's a good programming habit to always close handles, so do that.
Next issue: I can't believe imgur actually wants a base64-encoded copy of the image; that is such a waste of CPU, bandwidth, and RAM. The POST encoding they're using, multipart/form-data, is fully binary-safe, so there's no reason to base64-encode it. Are you sure they want it base64-encoded? Both you AND they can save about 33% bandwidth by NOT using base64 encoding here.
Given all the things you are doing wrong with the file operations, it would be better to use the file_get_contents() function: it opens the file in binary read mode (as opposed to your code, which opens it in TEXT mode, potentially corrupting the image as you read it), it reads the entire file (not just the chunk that the first fread() call happens to return, which, if you're lucky, is the entire file, but if you're not, is just the first chunk), and it does the fclose() for you (which you forget to do).
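Putting those suggestions together, a sketch of the simplified upload might look like this; the helper name upload_to_imgur is made up for this example, and the endpoint and Client-ID handling are taken from the question:

```php
<?php
// Sketch: file_get_contents() reads the whole file in binary mode and closes
// it for you; the image is sent raw (no base64), as multipart/form-data is
// binary-safe.
function upload_to_imgur(string $tmpName, string $clientId)
{
    $data = file_get_contents($tmpName);       // whole file, binary-safe
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, 'https://api.imgur.com/3/image.json');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Authorization: Client-ID ' . $clientId));
    curl_setopt($ch, CURLOPT_POSTFIELDS, array('image' => $data)); // raw bytes
    $reply = curl_exec($ch);
    curl_close($ch);
    return json_decode($reply);                // ->data->link on success
}
```

Call it per uploaded file inside the is_uploaded_file() guard, e.g. upload_to_imgur($tmpName, $client_id).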
Why don't you use the imgur PHP SDK and simply implement it with the code below? ($client is assumed to be an already-configured instance of the SDK's client class.)
require_once APPPATH."/third_party/imgur/vendor/autoload.php";
$pathToFile = '../path/to/file.jpg';
$imageData = [
'image' => $pathToFile,
'type' => 'file',
];
$client->api('image')->upload($imageData);
I am trying to write CSV files with fwrite/fputcsv in a loop, but when I use those functions they write to the first file and then the tab loads infinitely and doesn't do anything else.
I know those functions are the problem, because when I remove them the rest of the code runs properly.
Here's my code:
<?php
$auth = '';
$date = getdate(); // Needed for the Trade lists files
$path = array(
    'Some_folder/file1.csv',
    'Some_folder/file2.csv',
    'Some_folder/file3.csv',
    'Some_folder/file4.csv'
); // Full paths the files need to be in
$file = array(
    pathinfo($path[0])['basename'],
    pathinfo($path[1])['basename'],
    pathinfo($path[2])['basename'],
    pathinfo($path[3])['basename']
); // The names the files will have
$fp = array(
    fopen($file[0], 'w+'),
    fopen($file[1], 'w+'),
    fopen($file[2], 'w+'),
    fopen($file[3], 'w+')
); // Files that are created by fopen
for ($i = 0; $i < 4; $i++) {
    fputcsv($fp[$i], $file);
    echo 'test';
    $size = filesize($path[$i]);
    $cheaders = array(
        'Authorization: Bearer ' . $auth . '[Private access token]',
        'Content-Type: application/octet-stream',
        'Dropbox-API-Arg: {"path":"/' . $file[$i] . '", "mode":"add"}'
    );
    $ch = curl_init('[Private information]');
    curl_setopt($ch, CURLOPT_HTTPHEADER, $cheaders);
    curl_setopt($ch, CURLOPT_PUT, true);
    curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'POST');
    curl_setopt($ch, CURLOPT_INFILE, $fp[$i]);
    curl_setopt($ch, CURLOPT_INFILESIZE, $size);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);
    fclose($fp[$i]);
}
?>
fputcsv and fwrite do write to the first file (file1.csv), but they leave the program stuck loading forever. The console doesn't say anything since I can't load the page, and I don't know what else to try. I looked for a solution but unfortunately didn't find one, so here I am. I'll keep looking for an answer, but if someone has a solution, I'll take it too.
Thanks to all those who will try to help.
EDIT: It finally loaded, but it took a really long time; it started loading approximately an hour ago. Did I do something wrong that makes my program write to the files so slowly?
I would suggest you split the task into two phases. First, write the files on your server using fopen() in w mode. After the writes are done, use another loop to reopen the files with fopen() in r mode for the cURL upload. Using the same handle for both writing and uploading might cause the hanging problem.
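A minimal sketch of that two-phase approach, using temp-directory paths in place of the real Some_folder ones, with the Dropbox upload itself left as a comment since it needs real credentials:

```php
<?php
// Phase 1: write every CSV completely and close it, so the data is flushed
// to disk before any upload starts.
$dir = sys_get_temp_dir();
$paths = array($dir . '/file1.csv', $dir . '/file2.csv');

foreach ($paths as $path) {
    $fp = fopen($path, 'w');
    fputcsv($fp, array('col1', 'col2'));
    fclose($fp);
}

// Phase 2: reopen each file read-only; filesize() now reports the final size.
foreach ($paths as $path) {
    $fp = fopen($path, 'rb');
    $size = filesize($path);
    // cURL upload here, with CURLOPT_INFILE => $fp and
    // CURLOPT_INFILESIZE => $size, as in the question's code.
    fclose($fp);
}
```

Keeping the write handle out of CURLOPT_INFILE also avoids the unrewound-pointer problem: after fputcsv() the w+ handle sits at end-of-file, so cURL has nothing left to read from it.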
You can use curl_multi_init() to handle multiple cURL requests asynchronously. Here is also a post on it.
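For reference, here is a minimal curl_multi sketch; the URLs and timeouts are placeholders, not values from the question:

```php
<?php
// Run several cURL transfers concurrently instead of one at a time.
$urls = array('https://example.com/a', 'https://example.com/b');

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);  // placeholder timeouts
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all transfers until every handle has finished.
do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh);   // wait for activity instead of busy-looping
    }
} while ($active && $status === CURLM_OK);

foreach ($handles as $ch) {
    $body = curl_multi_getcontent($ch);   // response body (or false on error)
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

Each handle is configured exactly like a normal curl_init() handle, so the question's Dropbox upload options could be set on each one before adding it to the multi handle.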
How can I know how much data has been written in PHP cURL?
Here is my code, which downloads (i.e. writes) the data from a remote URL to my local server. But I want to know how much data has been written so far.
<?php
$url = 'https://speed.hetzner.de/1GB.bin';
$path = $_SERVER['DOCUMENT_ROOT'] . '/1gb.bin';
$fp = fopen($path, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_FILE, $fp);
$data = curl_exec($ch);
curl_close($ch);
fclose($fp);
For the downloaded size in bytes (which includes the size of the file, since the file is the body of the response), I use this after calling curl_exec($ch):
// $ch is the curl handle
$info = curl_getinfo($ch);
echo $info['size_download'];
CURLINFO_SIZE_DOWNLOAD - Total number of bytes downloaded
This is quoted from the libcurl documentation:
The amount is only for the latest transfer and will be reset again
for each new transfer. This counts actual payload data, what's also
commonly called body. All meta and header data are excluded and will
not be counted in this number.
And this gives the size, in bytes, of the request you made with cURL:
$info = curl_getinfo($ch);
echo $info['request_size'];
CURLINFO_REQUEST_SIZE - Total size of issued requests, currently only for HTTP requests
You can also call curl_getinfo() with the opt parameter set to one of the constants, like:
echo curl_getinfo($ch, CURLINFO_REQUEST_SIZE);
echo curl_getinfo($ch, CURLINFO_SIZE_DOWNLOAD);
See the curl_getinfo() function documentation.
As Dharman told you in the comments, don't switch off CURLOPT_SSL_VERIFYPEER. If you want to make HTTPS requests, check this: php-curl-https.
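Since the question asks how much has been written *so far* (not just the total after the transfer), a progress callback may fit better; download_with_progress is a helper name made up for this sketch, and the URL is the one from the question:

```php
<?php
// Report bytes downloaded while the transfer is still running, via
// CURLOPT_PROGRESSFUNCTION (requires CURLOPT_NOPROGRESS = false).
function download_with_progress(string $url, string $dest): void
{
    $fp = fopen($dest, 'wb');                      // "wb": binary-safe write
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_NOPROGRESS, false);   // enable the callback below
    curl_setopt($ch, CURLOPT_PROGRESSFUNCTION,
        function ($ch, $dlTotal, $dlNow, $ulTotal, $ulNow) {
            // $dlNow = bytes downloaded so far; $dlTotal is 0 when unknown
            if ($dlTotal > 0) {
                printf("downloaded %d of %d bytes\r", $dlNow, $dlTotal);
            }
            return 0;                              // non-zero aborts transfer
        });
    curl_exec($ch);
    curl_close($ch);
    fclose($fp);
}

// Usage, with the URL from the question:
// download_with_progress('https://speed.hetzner.de/1GB.bin', __DIR__ . '/1gb.bin');
```

The callback fires repeatedly during the transfer, so it can drive a progress bar or a log line, whereas size_download is only final after curl_exec() returns.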
I'm trying to take an image generated with imagepng() and upload it to a different server using cURL. Here is the code I've got. To be clear, I know the PNG is generated correctly because I can save it locally on the server the code runs on; I'm just not sure how to send it to a new server. All of the variables ($fileName, $imageObject, etc.) are set before this code:
$file = imagepng($imageObject, 'newTest' . $counter . '.png');
if ($file) {
    $ch = curl_init();
    $fp = $file;
    curl_setopt($ch, CURLOPT_URL, 'ftp://' . $ftp_user . ':' . $ftp_pass . '#' . $ftp_server . '/' . $fileName . '.png');
    curl_setopt($ch, CURLOPT_UPLOAD, 1);
    curl_setopt($ch, CURLOPT_INFILE, $fp);
    curl_setopt($ch, CURLOPT_INFILESIZE, filesize($file));
    curl_exec($ch);
    $error_no = curl_errno($ch);
    curl_close($ch);
    if ($error_no == 0) {
        $error = 'File uploaded succesfully.';
    } else {
        $error = 'File upload error.';
    }
}
The errors I am getting for every file (this code is in a loop processing multiple files) are below. Of course, {MY_URL} is replaced with the actual URL of my file:
Warning: curl_setopt(): supplied argument is not a valid File-Handle resource in {MY_URL} on line 43
Warning: filesize() [function.filesize]: stat failed for 1 in {MY_URL} on line 44
So it appears that the file is the wrong format when it's being cURLed. What do I need to set it to in order to send it correctly?
Thanks for your help!
imagepng() outputs the image into a file, but does not return a file handle (it only returns TRUE on success). You need to use something like fopen() to get a valid file handle. Try replacing $fp = $file with this:
$fp = fopen('newTest'.$counter.'.png', "rb");
Also, replace filesize($file) with filesize('newTest'.$counter.'.png'); filesize() expects a file name, not a boolean or a handle.
In general, $file is just a boolean, not a file handle. Use $fp for every function that expects a file handle. Also, don't forget to close every file at the end of the loop (e.g. add the following line after curl_close()):
fclose($fp);
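A sketch putting those corrections together; the image, counter, and FTP credentials below are placeholders standing in for the asker's real values (which are set elsewhere in their script):

```php
<?php
// Placeholder values for the demonstration only.
$ftp_user = 'user'; $ftp_pass = 'pass'; $ftp_server = 'ftp.example.com';
$fileName = 'upload'; $counter = 1;
$imageObject = imagecreatetruecolor(10, 10);   // stand-in for the real image

$path = 'newTest' . $counter . '.png';
if (imagepng($imageObject, $path)) {           // imagepng() returns a bool
    $fp = fopen($path, 'rb');                  // real handle for CURLOPT_INFILE
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL,
        'ftp://' . $ftp_user . ':' . $ftp_pass . '@' . $ftp_server . '/' . $fileName . '.png');
    curl_setopt($ch, CURLOPT_UPLOAD, 1);
    curl_setopt($ch, CURLOPT_INFILE, $fp);
    curl_setopt($ch, CURLOPT_INFILESIZE, filesize($path)); // file name, not the bool
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
    curl_exec($ch);
    $error_no = curl_errno($ch);
    curl_close($ch);
    fclose($fp);                               // close the handle every iteration
}
```

Note the ftp:// URL uses '@' between the credentials and the host, which is the standard separator for userinfo in a URL.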
I'm trying to save a local copy of an XML file and then open it with SimpleXML, but I'm getting some errors. Here's my code:
$feedURL = "https://gdata.youtube.com/feeds/api/users/manitobachildhealth/favorites";
//$xml = file_get_contents("$feedURL");
$xml = file_get_contents($feedURL);
file_put_contents("video.xml", $xml);
// read feed into SimpleXML object
//$sxml = simplexml_load_file($feedURL);
$sxml = simplexml_load_file('video.xml');
The error i'm getting is as follows:
Warning: file_get_contents(https://gdata.youtube.com/feeds/api/users/manitobachildhealth/favorites) [function.file-get-contents]: failed to open stream: Result too large in D:\wamp\www\videos2.php on line 48
I'm not sure why the result would be too large; it only returns 6 KB of XML. What am I doing wrong?
Update:
This is running on a windows platform using WAMP server - not ideal, but i'm stuck with it.
Update 2:
I've tried using cURL and fwrite to achieve a similar result, as suggested below, but it won't write the XML file to the local server. It doesn't give me any errors, though.
update 3:
This is obviously a very specific problem with the hosting environment, but I'm not sure where to start looking. Using cURL works great on a Linux-based dev server, but causes problems on this Windows-based production server. Any extra help troubleshooting this issue would be most appreciated!
Correct answer for the question:
It is possible you are having the same problem as of this question: CURL and HTTPS, "Cannot resolve host" (DNS-Issue)
Other Details:
You can use SimpleXML to load and save the XML data:
$xml = new SimpleXMLElement('https://gdata.youtube.com/feeds/api/users/manitobachildhealth/favorites', NULL, TRUE);
$xml->asXML('video.xml');
I have tested the code above in a WAMP server and it works fine.
Update:
If the above returns the error message "SimpleXMLElement::__construct(): I/O warning : failed to load external entity ...", it is possible that your server does not allow including external data, or the PHP script does not have the right permissions.
Try the following:
1. Echo the content of the XML file:
$xml = new SimpleXMLElement('https://gdata.youtube.com/feeds/api/users/manitobachildhealth/favorites', NULL, TRUE);
echo htmlentities($xml->asXML());
If you managed to retrieve the XML content and print it to the browser, then your server allows including external content, and the problem is most likely file permissions. Make sure the script has the rights to create the XML file.
If the above still does not work, try using cURL:
function getPageContent($options)
{
    $default = array(
        'agent'   => $_SERVER['HTTP_USER_AGENT'],
        'url'     => '',
        'referer' => 'http://' . $_SERVER['HTTP_HOST'],
        'header'  => 0,
        'timeout' => 5,
        'user'    => '',
        'proxy'   => '',
    );
    $options = array_merge($default, $options);
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $options['url']);
    curl_setopt($ch, CURLOPT_HEADER, $options['header']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    if ($options['proxy'] != '') {
        curl_setopt($ch, CURLOPT_PROXY, $options['proxy']);
    }
    curl_setopt($ch, CURLOPT_HTTPPROXYTUNNEL, 0);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $options['timeout']);
    curl_setopt($ch, CURLOPT_REFERER, $options['referer']);
    curl_setopt($ch, CURLOPT_USERAGENT, $options['agent']);
    if ($options['user'] != '') {
        curl_setopt($ch, CURLOPT_PROXYUSERPWD, $options['user']);
    }
    $result = array();
    $result['content'] = curl_exec($ch);
    $result['info'] = curl_getinfo($ch);
    $result['error'] = curl_error($ch);
    curl_close($ch);
    return $result;
}
$result = getPageContent(array(
    'proxy' => '[ip or address]:[port]', // if needed
    'user'  => '[username]:[password]', // if needed
    'url'   => 'http://gdata.youtube.com/feeds/api/users/manitobachildhealth/favorites'
));
if (empty($result['error'])) {
    // ok: content of xml file
    echo htmlentities($result['content']);
    // file
    $filename = 'video.xml';
    // open file
    if (!$fp = fopen($filename, 'wt')) {
        die("Unable to open '$filename'\n\n");
    }
    // write content to file
    fwrite($fp, $result['content']);
    // close file
    fclose($fp);
} else {
    // failed
    echo '<pre>';
    echo 'Error details;';
    print_r($result['error']);
    echo '<hr />Other info:';
    print_r($result['info']);
    echo '</pre>';
}
Have you tried using cURL to get the contents and then writing them to a local file? Note that the feed is fetched with a plain GET (no CURLOPT_POST), and that writing by file name needs file_put_contents(), since fwrite() expects a file handle:
$ch = curl_init("https://gdata.youtube.com/feeds/api/users/manitobachildhealth/favorites");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$output = curl_exec($ch);
curl_close($ch);
file_put_contents("video.xml", $output);
I'm attempting to pass a key via $_SERVER['HTTP_REFERER'] (set via cURL) from one server to another, to then process some POST data and send an XML file back. I have the key stored in identical files on both servers (rotated via cron-job SCP), but when I read the key on the receiving end and compare it to the referer, it fails every time and throws the 418 back at me.
If I use if($refkey == "mykeyhere") instead of if($refkey == $key) it works properly, but obviously hard-coding the key isn't going to work here; I need to be able to read it from a file.
I've tried using strstr() and casting with (string)$key, but gettype() on both returns string, so the comparison should work. The key is being sent properly in $_SERVER['HTTP_REFERER'] and being read in properly from the file on the receiving end; I've echoed both and they look identical. Does anyone know why the comparison would fail? Am I doing something stupid and obviously wrong? Thanks.
cURL request build code:
$file = "key";
$fh = fopen($file, 'r');
$key = fread($fh, filesize($file));
fclose($fh);
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://example.com/xmlreq.php");
curl_setopt($ch, CURLOPT_HEADER, 1);
curl_setopt($ch, CURLOPT_FORBID_REUSE, 1);
curl_setopt($ch, CURLOPT_FRESH_CONNECT, 1);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_REFERER, $key);
curl_setopt($ch, CURLOPT_POSTFIELDS, "greeting=hello" );
curl_exec($ch);
curl_close($ch);
receiving code (xmlreq.php):
if ($_SERVER['REQUEST_METHOD'] === "POST") {
    $file = "key";
    $fh = fopen($file, "r");
    $key = fread($fh, filesize($file));
    fclose($fh);
    $refkey = $_SERVER['HTTP_REFERER'];
    if ($refkey == $key) {
        print_r("hello, world!"); // it should run this
    } else {
        header("HTTP/1.0 418 I'm a teapot"); // but runs this instead
    }
} else {
    header("HTTP/1.0 404 Not Found");
}
As a first troubleshooting step, replace $key with trim($key) on both sides. Alternatively, on the receiving side, change your condition to if (trim($refkey) == trim($key)) {.
The key file probably contains a newline/carriage return after the key, and that extra character is NOT sent in the referer header. The receiving side then compares the referer against the file contents, and the file version has the newline included, so the comparison fails.
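A tiny self-contained sketch of the fix; the key value and temp file here are made up for the demonstration:

```php
<?php
// The key file on disk usually ends with a newline that the referer header
// does not carry; trim both sides before comparing.
$keyFile = tempnam(sys_get_temp_dir(), 'key');
file_put_contents($keyFile, "mykeyhere\n");     // trailing newline, as on disk

$key    = trim(file_get_contents($keyFile));    // trim() strips the "\n"
$refkey = 'mykeyhere';                          // what arrives in HTTP_REFERER

var_dump($refkey === $key);                     // bool(true)
unlink($keyFile);
```

Using file_get_contents() instead of fopen()/fread()/fclose() also shortens the reading code on both servers.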