Upload content:// file to server - php

I have a URI of a file on an Android phone: content://com.android.contacts/contacts/472/photo. I got it from the PhoneGap Contacts plugin; it is actually a contact image. Now I need to upload that image to a server, but when I upload it and try to view it, it is not visible.
I managed to google a little and found that for a file upload I need a full file:// path.
So my question is: how do I convert content:// to file://, or how do I upload a file with a content:// URI to the server?
I have tried using window.resolveLocalFileSystemURI(uri, win, fail); and then
function win(fe){
    alert("win");
    alert(fe.fullPath);
}
function fail(a){
    alert('fail');
}
But somehow it just skips window.resolveLocalFileSystemURI(uri, win, fail); - it is never executed. There is no error in LogCat and not a single alert from any of the three.
Could this approach solve my problem, and why is no response returned?

I don't have direct experience with the Contacts plugin, but the docs make me think that you are not looking at a filesystem URI, but rather at an entry in the contacts database.
[http://docs.phonegap.com/en/3.3.0/cordova_contacts_contacts.md.html#contacts.find]
So you are likely misinterpreting the concept of a URI, especially in the context of the filesystem. [http://en.wikipedia.org/wiki/Uniform_resource_identifier] and [http://www.html5rocks.com/en/tutorials/file/filesystem/].
From the plugin docs, we do see that 'photos' is a property of each contact.
how do I upload a file with a content:// URI to the server?
For the desired contact, you could create a file using the photos property, and then upload it using the PhoneGap FileTransfer plugin; a server-side sketch follows below. [http://docs.phonegap.com/en/3.3.0/cordova_file_file.md.html#FileTransfer]
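On the PHP side, a minimal receiver for that FileTransfer upload could look like the sketch below (my illustration, not from the plugin docs: 'file' is the plugin's default fileKey, and the uploads/ directory is a hypothetical target):

<?php
// upload.php - minimal receiver for a PhoneGap FileTransfer upload.
// FileTransfer sends a regular multipart/form-data POST; the file arrives
// under the fileKey option, which defaults to "file".
$uploadDir = __DIR__ . '/uploads/'; // hypothetical target directory

if (isset($_FILES['file']) && $_FILES['file']['error'] === UPLOAD_ERR_OK) {
    $name = basename($_FILES['file']['name']); // never trust the raw client path
    if (move_uploaded_file($_FILES['file']['tmp_name'], $uploadDir . $name)) {
        echo 'OK';
    } else {
        header('HTTP/1.1 500 Internal Server Error');
        echo 'Could not move uploaded file';
    }
} else {
    header('HTTP/1.1 400 Bad Request');
    echo 'No file received';
}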

Related

Lithium PHP Framework - unable to upload files on live site

I'm working on a site that is already live, and when I attempt to upload a file I receive "Status Code: 201 Created", but no content after that.
When I run the site locally and upload the file, I get the same status code, but this time followed by content: JSON with the data of the newly created file.
In both cases I can see in the Mongo database that the file is created, and when I access it through its controller at
http://{domain}/file/{file-id}
I see the file - even on live.
The problem appears to be somewhere after that saving ... and before [[something]] returns the JSON with the file data ...
... but because everything in Lithium is so heavily automated ... I don't know how to find the problem.
(And I don't want to dump inside the framework itself ... I'm supposed to use the framework, not to debug it! ...)
Well, the problem turned out to be that after the upload the server should return JSON with the data for the file, but on live json_encode() was returning false because of some non-UTF-8 text in the object. I managed to work around this ... and the solution can be found in this question I posted today: json_encode() turn non-UTF8 strings into null, but on live site returns false
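For anyone landing here with the same symptom, here is a minimal sketch of the failure and one possible workaround (the utf8ize() helper is my illustration and assumes the stray bytes are ISO-8859-1):

<?php
$doc = array('title' => "caf\xE9"); // ISO-8859-1 "café" - not valid UTF-8
// Depending on the PHP version, json_encode() either returns false or nulls
// out the offending string - the very inconsistency described above.
var_dump(json_encode($doc)); // bool(false) on the live server

// Workaround: convert every string in the structure to UTF-8 first.
function utf8ize($value) {
    if (is_array($value)) {
        return array_map('utf8ize', $value);
    }
    return is_string($value) ? utf8_encode($value) : $value;
}
var_dump(json_encode(utf8ize($doc))); // valid JSON again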

Saving XLS from Google Docs with PHP

I'm trying to parse XLS files from Google Docs with PHP. It works fine when I manually download a file and then upload it to the server, but when I use PHP to save the exact same XLS file to the server directly, instead of getting all the data in the XLS, the response is:
<b>DOM ELEMENT: </b>HTML<br /><b>ATTRIBUTE: </b>lang => en<br /><b>DOM ELEMENT: </b>HEAD<br /><b>DOM ELEMENT: </b>META<br /><b>ATTRIBUTE: </b>charset => utf-8<br /><b>DOM ELEMENT: </b>META<br /><b>ATTRIBUTE: </b>content => width=300, initial-scale=1<br /><b>ATTRIBUTE: </b>name => viewport<br /><b>DOM ELEMENT: </b>META<br /><b>ATTRIBUTE: </b>name => description<br /><b>ATTRIBUTE: </b>content => Create a new spreadsheet and edit with others at the same time -- from your computer, phone or tablet. Get stuff done with or without an internet connection. Use Sheets to edit Excel files. Free from Google.<br /><b>DOM ELEMENT: </b>TITLE<br />
Here's an example of how I use PHP to save the XLS to the server:
$fileName = 'xls/newday2014.xls';
$xlsURL = 'https://docs.google.com/spreadsheets/d/1KKMiBOlvpKaAJ_MsNfaWGmR6ixL53AjAaLf0R18X3e4/edit#gid=161299136';
file_put_contents($fileName, file_get_contents($xlsURL));
You're missing some fundamental things here with your three lines of code:
file_get_contents is not a browser. Any URI (URL) it takes cannot carry a fragment (in your case #gid=161299136), because the fragment is never sent to the server.
That last point also highlights: if that exact URI works for downloading in your browser, then most likely something runs in the browser to build the correct download URI first. So you're using the wrong URI to download.
file_get_contents does not log you into Google accounts by magic.
Just giving the file a name ending in .xls does not magically change its format from HTML into an Excel spreadsheet.
As these are already four fundamental problems with your three lines of code, it should be obvious that this code is highly unfit for what you're trying to do. I suggest you throw it away and start from scratch after doing some research, e.g. contact the vendor of that webservice about your support options as a PHP developer. Most likely they offer an API for what you're trying to do.
Quickstart: Run a Drive App in PHP (via: How do I read a Google Drive Spreadsheet in PHP?)
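As a concrete illustration (my addition, not part of the answer above): if the sheet is shared publicly ("anyone with the link"), the Sheets export endpoint serves the raw file and needs no login at all. A sketch, assuming public sharing:

<?php
// Use the export endpoint instead of the interactive /edit page, and drop
// the #gid fragment (fragments are never sent to the server anyway).
$docId  = '1KKMiBOlvpKaAJ_MsNfaWGmR6ixL53AjAaLf0R18X3e4';
$xlsURL = 'https://docs.google.com/spreadsheets/d/' . $docId . '/export?format=xlsx';
$data   = file_get_contents($xlsURL);

// A real spreadsheet file never starts with '<'; if we got HTML instead
// (a login or "request access" page), fail loudly rather than saving it.
if ($data === false || $data[0] === '<') {
    die('Export failed - is the sheet shared with "anyone with the link"?');
}
file_put_contents('xls/newday2014.xlsx', $data);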
Most likely a cookie is set in your browser when you log in to Google Docs; this cookie is not present on the file_get_contents($xlsURL) call, so you get different content. The web debugger of your choice will confirm that, as will pasting your URL into a browser that is not logged in.
The cURL extension can hand cookies to the server, but please understand that this cookie is dynamic, so getting it out of your browser and into cURL is by far not enough. Most likely you would have to walk the complete way from login to the document, and update the process whenever Google chooses to change something.
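Purely to illustrate that mechanism (not a recommendation - as said, the cookie is dynamic and this breaks as soon as the session expires), cURL can replay a cookie captured from a logged-in browser:

<?php
// Illustration only: replaying a captured Google session cookie with cURL.
// The cookie value below is hypothetical and short-lived in practice.
$ch = curl_init($xlsURL);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_COOKIE, 'SID=...; HSID=...'); // copied from the browser
$data = curl_exec($ch);
curl_close($ch);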

$_SERVER['HTTP_X_FILENAME'] undefined only on Go Daddy

Hey there, so I have sent in a support ticket to Go Daddy but they aren't being very helpful. In an admin panel for one of my client's websites there is an AJAX image uploader. Normally I don't work with Go Daddy, and the script has always worked just fine with other hosts. However, this client has forced me into dealing with Go Daddy's mess of a system, and I am having trouble getting the AJAX image uploader to work.
The problem seems to be that the $_SERVER['HTTP_X_FILENAME'] is simply not defined, even when a file is posted to PHP using an HTTP request.
In my PHP code I have the following line to grab the file name:
$fn = (isset($_SERVER['HTTP_X_FILENAME']) ? $_SERVER['HTTP_X_FILENAME'] : false);
And on Go Daddy it returns false. Also, if I print_r($_SERVER);, HTTP_X_FILENAME isn't even listed.
I have checked permissions and everything has the correct permissions. I have checked the error log and no error is being generated. I have checked the php.ini, and file uploads are enabled with a 32 MB max (way bigger than any file I've tested with).
I have no idea where to look now, as my Google searches have come up with nothing. (And server admin really isn't my forte.)
Any help would be much appreciated.
Thanks!!
OK, I got this working. I'm not sure how you manage saving files in ProcessWire when using the admin - do they get saved directly to the assets/files/id/ folder, or is there a tmp folder somewhere?
// Check whether a file is being uploaded (signalled by the X-Filename header)
$filename = (isset($_SERVER['HTTP_X_FILENAME']) ? $_SERVER['HTTP_X_FILENAME'] : false);
if ($filename) {
    $this->message("It is a file coming!");
    // The raw request body holds the file data
    file_put_contents(
        'C:/Temp/' . $filename,
        file_get_contents('php://input')
    );
} else {
    if (!$this->isAjaxPost) throw new WireException("This functionality may only be accessed from AJAX POST at present");
    // etc etc...
}
And now I am able to save my files to C:/Temp/. I'm not sure about the best way to handle things from here: save it to a temp folder and then save the file with the regular ProcessWire API? Or save it straight to the assets/files/id/ folder and then use the API? I can easily send more data (like the field name) as HTTP headers.
If we save it directly to the right place (probably safe, since this is admin usage), how do I get the right path? $config->paths->files->3242 or something like that?
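Tangentially, since the original problem was the header never showing up in $_SERVER: how request headers are exposed depends on the server API, so a more defensive lookup can help on hosts like this. A sketch (my addition; getallheaders() is only guaranteed under Apache, and under FastCGI from PHP 5.4 on):

<?php
// Look up a request header regardless of how the server API exposes it.
function request_header($name) {
    $key = 'HTTP_' . strtoupper(str_replace('-', '_', $name));
    if (isset($_SERVER[$key])) {
        return $_SERVER[$key];
    }
    // Fallback for server APIs that don't populate $_SERVER with the header.
    if (function_exists('getallheaders')) {
        foreach (getallheaders() as $header => $value) {
            if (strcasecmp($header, $name) === 0) {
                return $value;
            }
        }
    }
    return false;
}

$filename = request_header('X-Filename'); // drop-in for the direct $_SERVER lookup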

Uploading image to server using byte array to php upload in flex 4.5 air

I'm using Flex 4.5 in a mobile AIR for Android application.
So, using the Camera/CameraRoll (http://www.unitedmindset.com/jonbcampos/2010/09/29/air-for-android-camera-and-camera-roll/),
I want to do a simple upload using amfphp.
Note: since this is how you get pictures on devices, I can't use FileReference, because it wants you to get the pictures using "browse", which can't be done on Android or iOS.
My plan is that after I select a picture or capture one with the camera, in the event I can get the local URL of the picture that was taken, like this:
file:///mnt/sdcard/DCIM/Camera/IMG_20110531_205113.jpg
I'm putting this in the imageURL var.
(I'm assuming I should make it into a byte array to transfer it; I'm not exactly sure - this is my first time making something like this.)
Here's how I'm taking that image, turning it into a byte array, and using amfphp to send the upload to the server:
protected function upload_btn_clickHandler(event:MouseEvent):void
{
    var request:URLRequest = new URLRequest(imageURL);
    // Don't pass the request to the constructor - that would start loading
    // immediately, before dataFormat is switched to BINARY.
    var urlLoader:URLLoader = new URLLoader();
    urlLoader.dataFormat = URLLoaderDataFormat.BINARY;
    urlLoader.addEventListener(Event.COMPLETE, onURLLoaderComplete);
    urlLoader.load(request);
}
private function onURLLoaderComplete(event:Event):void
{
    var byteArray:ByteArray;
    byteArray = event.target.data;
    // send upload using amfphp!
    gw.call("MyClass.uploadFile", uploadImageRes, byteArray);
}
Here's my php code:
function uploadFile($fileData) {
    // Build a unique target name from microtime plus the original filename.
    $myFilePath = '../../../assets/userphotos/imageone'.
        preg_replace("/[^0-9]+/","_",microtime()).'_'.$fileData["filename"];
    file_put_contents($myFilePath, $fileData["filedata"]);
    // eventually add the mysql query to add the image path to mysql server
    return true;
}
But I'm getting the error NetConnection.Call.BadVersion.
This is the first time I'm trying something like this, so I'm not even really sure I'm going about it right. All the examples I find online use the FileReference class, but that seems to require the "browse" method, and I'm in a mobile application that uses the camera and camera roll to grab pictures off the device; I'm not sure how to fit that into the FileReference class. I figured I could just get the byte array and send it to PHP myself, and it should be just fine.
My end goal is to upload the image to a folder on the server, and I'll then make a MySQL update to store the location of the file. I'm not really worried about the MySQL part yet; I'm sure that part will be really easy and I can figure it out. I just wanted some help with actually uploading the image to a directory. Thanks!
I can at least help you with debugging this.
NetConnection.Call.BadVersion is usually thrown in the client when an error has been raised in the PHP script. PHP then outputs the error, which invalidates the AMF structure of the response, so Flash no longer recognizes the AMF format and throws a "Bad version" error.
I recommend you get a packet capture program (like Wireshark) and see what the server is outputting, or, if you are only getting this error on the mobile device, add a check at the bottom of the PHP script with error_get_last() and save the result to a text file that you can review.
When you see which error PHP is throwing, maybe you can get further with this.
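A sketch of that error_get_last() idea (my variant: a shutdown handler, so even fatal errors that abort the script get logged; the log path is hypothetical):

<?php
// Put this at the top of the amfphp entry script while debugging.
// Fatal errors still trigger shutdown functions, so this also catches
// what a check at the very bottom of the script would miss.
function log_last_error() {
    $error = error_get_last();
    if ($error !== null) {
        file_put_contents(
            dirname(__FILE__) . '/amf_errors.log', // hypothetical log location
            date('c') . ' ' . print_r($error, true),
            FILE_APPEND
        );
    }
}
register_shutdown_function('log_last_error');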

How do I display protected Amazon S3 images on my secure site using PHP?

I am trying to move images for my site from my host to Amazon S3 cloud hosting. These images are of client work sites and cannot be publicly available. I would like them to be displayed on my site preferably by using the PHP SDK available from Amazon.
So far I have been able to script the conversion: I look up records in my database, grab the file path, name the file appropriately, and send it to Amazon.
//upload to s3
$s3->create_object($bucket, $folder.$file_name_new, array(
    'fileUpload' => $file_temp,
    'acl' => AmazonS3::ACL_PRIVATE, //access denied, grantee only own
    //'acl' => AmazonS3::ACL_PUBLIC, //image displayed
    //'acl' => AmazonS3::ACL_OPEN, //image displayed, grantee everyone has open permission
    //'acl' => AmazonS3::ACL_AUTH_READ, //image not displayed, grantee auth users has open permissions
    //'acl' => AmazonS3::ACL_OWNER_READ, //image not displayed, grantee only ryan
    //'acl' => AmazonS3::ACL_OWNER_FULL_CONTROL, //image not displayed, grantee only ryan
    'storage' => AmazonS3::STORAGE_REDUCED
));
Before I copy everything over, I have created a simple form to test uploading and displaying an image. If I upload an image using ACL_PRIVATE, I can either grab the public URL and have no access, or grab the public URL with a temporary key and display the image.
<?php
//display the image link
$temp_link = $s3->get_object_url($bucket, $folder.$file_name_new, '1 minute');
?>
<a href='<?php echo $temp_link; ?>'><?php echo $temp_link; ?></a><br />
<img src='<?php echo $temp_link; ?>' alt='finding image' /><br />
Using this method, how will my caching work? I'm guessing every time I refresh the page, or modify one of my records, I will be pulling that image again, increasing my GET requests.
I have also considered using bucket policies to only allow image retrieval from certain referrers. Do I understand correctly that Amazon is supposed to serve requests only from pages or domains I specify?
I referenced
https://forums.aws.amazon.com/thread.jspa?messageID=188183&#188183
to set that up, but then was confused as to which security I need on my objects. It seemed like if I made them private they still would not display, unless I used the temp link as mentioned previously. If I made them public, I could navigate to them directly, regardless of referrer.
Am I way off on what I'm trying to do here? Is this not really supported by S3, or am I missing something simple? I have gone through the SDK documentation and done lots of searching, and I feel this should be documented a little more clearly, so hopefully any input here can help others in this situation. I've read of others who name the file with a unique ID, creating security through obscurity, but that won't cut it in my situation, and it's probably not best practice for anyone trying to be secure.
The best way to serve your images is to generate a url using the PHP SDK. That way the downloads go directly from S3 to your users.
You don't need to download via your servers as @mfonda suggested - you can set any caching headers you like on S3 objects - and if you proxied downloads you would be losing some major benefits of using S3.
However, as you pointed out in your question, the URL will always be changing (actually the querystring), so browsers won't cache the file. The easy workaround is simply to always use the same expiry date so that the same querystring is always generated. Better still, 'cache' the URL yourself (e.g. in the database) and reuse it every time, as sketched below.
You'll obviously have to set the expiry time somewhere far into the future, but you can regenerate these URLs every so often if you prefer. E.g. in your database you would store the generated URL and the expiry date (you could parse that from the URL too). Then either you just use the existing URL or, if the expiry date has passed, generate a new one, and so on.
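A sketch of that caching idea, reusing the get_object_url() call from the question (load_cached_url() and save_cached_url() are hypothetical stand-ins for your own database reads and writes):

<?php
// Reuse a stored pre-signed URL until shortly before it expires.
function image_url($s3, $bucket, $key) {
    $cached = load_cached_url($key); // hypothetical: array('url' => ..., 'expires' => ...)
    if ($cached && $cached['expires'] > time() + 60) {
        return $cached['url']; // same URL as last time, so browsers keep caching it
    }
    $expires = '+1 week';
    $url = $s3->get_object_url($bucket, $key, $expires); // same SDK call as in the question
    save_cached_url($key, $url, strtotime($expires)); // hypothetical DB write
    return $url;
}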
You can use bucket policies in your Amazon bucket to allow your application's domain to access the file. In fact, you can even add your local dev domain (e.g. mylocaldomain.local) to the access list and you will be able to retrieve your images. Amazon provides sample bucket policies here: http://docs.aws.amazon.com/AmazonS3/latest/dev/AccessPolicyLanguage_UseCases_s3_a.html. This was very helpful in getting my images served.
The policy below solved the problem that brought me to this SO topic:
{
  "Version": "2008-10-17",
  "Id": "http referer policy example",
  "Statement": [
    {
      "Sid": "Allow get requests originated from www.example.com and example.com",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::examplebucket/*",
      "Condition": {
        "StringLike": {
          "aws:Referer": [
            "http://www.example.com/*",
            "http://example.com/*"
          ]
        }
      }
    }
  ]
}
When you talk about security and protecting data from unauthorized users, one thing is clear: you have to check, on every access to the resource, that the requester is entitled to it.
A generated URL, by contrast, can be used by anyone who has it (it might be difficult to obtain, but still...). The only watertight solution is an image proxy, which you can build with a PHP script.
There is a fine article on Amazon's blog that suggests streaming the object with readfile: http://blogs.aws.amazon.com/php/post/Tx2C4WJBMSMW68A/Streaming-Amazon-S3-Objects-From-a-Web-Server
readfile('s3://my-bucket/my-images/php.gif');
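Fleshing that line out into a runnable sketch (AWS SDK 2, which that article is based on; the credentials and bucket name are placeholders):

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Register the s3:// stream wrapper so readfile() can stream the object
// through this script without buffering the whole file in memory.
$client = S3Client::factory(array(
    'key'    => 'YOUR_AWS_KEY',
    'secret' => 'YOUR_AWS_SECRET',
));
$client->registerStreamWrapper();

header('Content-Type: image/gif');
readfile('s3://my-bucket/my-images/php.gif'); // the same object as above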
You can download the contents from S3 (in a PHP script), then serve them using the correct headers.
As a rough example, say you had the following in image.php:
$s3 = new AmazonS3();
$response = $s3->get_object($bucket, $image_name);
if (!$response->isOK()) {
    throw new Exception('Error downloading file from S3');
}
header("Content-Type: image/jpeg");
header("Content-Length: " . strlen($response->body));
die($response->body);
Then in your HTML code, you can do
<img src="image.php">
