I'm using this official Azure Blob Storage sample file.
It shows how to upload HelloWorld.txt.
That upload succeeds. However, if I change just one thing, for example HelloWorld.txt to Hello.jpg, a 0-byte file is uploaded.
My code, phpQS.php, is:
// Create blob client.
$blobClient = BlobRestProxy::createBlobService($connectionString);
$fileToUpload = "Hello.jpg";
if (!isset($_GET["Cleanup"])) {
    // Create container options object.
    $createContainerOptions = new CreateContainerOptions();
    ...
}
Could you tell me how to upload a file completely?
This is the output in the web browser (with HTML):
This is what the Azure console shows:
The sample project has an issue; that is why it doesn't work:
https://github.com/Azure-Samples/storage-blobs-php-quickstart/issues/6
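For reference, a minimal sketch of uploading an existing binary file with the Azure Storage PHP SDK, assuming the library is installed via Composer and the container already exists (the container name here is just an example):

require_once 'vendor/autoload.php';

use MicrosoftAzure\Storage\Blob\BlobRestProxy;

$blobClient = BlobRestProxy::createBlobService($connectionString);
$fileToUpload = "Hello.jpg";

// Open the existing file in binary mode so no bytes are lost on upload.
$content = fopen($fileToUpload, "rb");
$blobClient->createBlockBlob("mycontainer", $fileToUpload, $content);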
I'm trying to upload a PDF document from a staging server to a remote location using $sftp->put();
CODE:
$sftp = new SFTP($config::SFTP_SERVER);
// login to remote server
if (!$sftp->login($config::SFTP_USER, $config::SFTP_PASSWORD)) {
    throw new Exception('Login failed');
}
// move to relevant directory
$sftp->chdir('fatca');
// upload file
$uploadFile = $sftp->put('test-pdf-upload.pdf', '/srv/www/vhosts/stage.johno.com/fatca/src/uploads/pdfs/345-553453-434__05122017_16:45:26.pdf', NET_SFTP_LOCAL_FILE);
// Error checking for local env only
var_dump($uploadFile);
var_dump($sftp->getSFTPLog());
I'm expecting to view the same PDF, which contains user data and some user-uploaded images. I've also confirmed that the original PDF was created successfully on the staging server; it is intact and shows the relevant information.
The resulting file is created in the new remote server location; however, it is damaged/unreadable.
The output from var_dump($sftp->getSFTPLog()); is not encouraging either:
bool(false)
What am I doing wrong here? I feel like I've followed the phpseclib documentation well... although it's been one of those long, long days in front of the screen!
Any advice greatly appreciated as always.
You're using phpseclib 2.0. I can tell because you're doing new SFTP() instead of new Net_SFTP(). For 2.0 you need to use SFTP::SOURCE_LOCAL_FILE instead of NET_SFTP_LOCAL_FILE, e.g.
$uploadFile = $sftp->put(
    'test-pdf-upload.pdf',
    '/srv/www/vhosts/stage.johno.com/fatca/src/uploads/pdfs/345-553453-434__05122017_16:45:26.pdf',
    SFTP::SOURCE_LOCAL_FILE
);
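It can also help to check the return value and ask phpseclib why the transfer failed; a small sketch, still assuming phpseclib 2.0:

if (!$uploadFile) {
    // getLastSFTPError() returns the server's reason for rejecting the operation
    echo 'Upload failed: ' . $sftp->getLastSFTPError();
}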
I have a Django app that contains a Video-on-Demand feature. It's powered by Azure Media Services (AMS). When a user uploads a video, I first save the video in an Azure storage blob, and then I use a PHP script (which utilizes the AMS PHP SDK) to encode that video and prep a streaming URL (hosted on AMS).
My problem is this: how do I get the dimensions of the video? I need to know the height and width so that I can encode the video to lower-res formats on AMS. I can't get the dimensions from Python since I'm not uploading the video file onto a local server first (where my web server is running). What are my options? Please advise.
Since you are using the AMS SDK for PHP to create the AMS task, which requires the video asset file, you can leverage the PHP module getID3 (http://getid3.sourceforge.net/) to get the video asset's info during the PHP processing with ease.
Download the module from http://getid3.sourceforge.net/, extract it into your PHP application's folder, and use the following code snippet to get the dimensions of the video asset:
require_once('./getid3/getid3.php');
$filename="<video_path>";
$getID3 = new getID3;
$ThisFileInfo = $getID3->analyze($filename);
var_dump($ThisFileInfo['asf']['video_media']);
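Note that the ['asf']['video_media'] key is specific to ASF/WMV containers; for most other formats (MP4 and so on) getID3 reports the dimensions under the generic ['video'] key, so reading something like the following is usually the safer bet:

$width  = $ThisFileInfo['video']['resolution_x'];
$height = $ThisFileInfo['video']['resolution_y'];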
If you have any further concerns, please feel free to let me know.
Update: using a remote file on Azure Storage
Here is a code sample with which you can use the SAS URL of a blob on Azure Storage. It downloads the file to a server folder, detects the info, and then deletes the temporary file.
$remotefilename = '<SAS Url>';
if ($fp_remote = fopen($remotefilename, 'rb')) {
    $localtempfilename = tempnam('/tmp', 'getID3');
    if ($fp_local = fopen($localtempfilename, 'wb')) {
        while ($buffer = fread($fp_remote, 8192)) {
            fwrite($fp_local, $buffer);
        }
        fclose($fp_local);
        // Initialize getID3 engine
        $getID3 = new getID3;
        $ThisFileInfo = $getID3->analyze($localtempfilename);
        // Delete temporary file
        unlink($localtempfilename);
    }
    fclose($fp_remote);
    var_dump($ThisFileInfo);
}
I'm using the AWS PHP SDK to save images on S3. The files are saved privately. Then I'm showing the image thumbnails using the S3 file URL in my web application, but since the files are private, the images are displayed as corrupt.
When the user clicks on the name of the file, a modal is opened to show the file at a larger size, but the file is displayed as corrupt there as well due to the same issue.
Now, I know that there are two ways to make this work: 1. make the files public, or 2. generate pre-signed URLs for the files. But I cannot go with either of these options due to the requirements of my project.
My question is: is there any third way to resolve this issue?
I'd highly advise against this, but you could create a script on your own server that pulls the image via the API, caches it, and serves it. You can then restrict access however you like without making the images public.
Example pass-through script:
$headers = get_headers($realpath); // $realpath being wherever the file really is
foreach ($headers as $header) {
    header($header);
}
$filename = $version->getFilename(); // $version being whatever object holds the file's name in your app
// These lines if it's a download you want to do
// header('Content-Description: File Transfer');
// header("Content-Disposition: attachment; filename={$filename}");
$file = fopen($realpath, 'r');
fpassthru($file);
fclose($file);
exit;
This will barely "touch the sides" and shouldn't delay the appearance of your files too much, but it's still going to take some resources and bandwidth.
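A variation on the same idea, sketched with the AWS SDK for PHP v3 (the SDK install, region, bucket, and key below are all assumptions): registering the SDK's stream wrapper lets fopen() read private objects through s3:// paths, and headObject() supplies the response headers instead of get_headers():

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);
$s3->registerStreamWrapper(); // makes s3://bucket/key usable with fopen()

$bucket = 'my-private-bucket';    // made-up bucket name
$key    = 'thumbs/photo-123.jpg'; // made-up object key

// Pull the metadata we need for the response headers.
$meta = $s3->headObject(['Bucket' => $bucket, 'Key' => $key]);
header('Content-Type: ' . $meta['ContentType']);
header('Content-Length: ' . $meta['ContentLength']);

$file = fopen("s3://{$bucket}/{$key}", 'r');
fpassthru($file);
fclose($file);
exit;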
You will need to access the files through a script on your server. That script will do some kind of authentication to make sure the request is valid and that you want the requester to see the file. It then fetches the file from S3 using a valid IAM profile that can access the private files and outputs the file.
Instead of requesting the file from S3, request it from
http://www.yourdomain.com/fetchimages.php?key=8498439834
Then here is some pseudocode in fetchimages.php
<?php
//if authorized to get this image
$key=$_GET['key'];
//validate key is the proper format
//get s3 url from a database based on the $key
//connect to s3 securely and read the file from s3
//output the file
?>
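Filled in with the AWS SDK for PHP v3, that pseudocode might look roughly like this; the region, bucket, and the lookUpObjectKey() helper are all assumptions standing in for your own setup and database lookup:

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// if authorized to get this image
$key = $_GET['key'];
if (!preg_match('/^\d+$/', $key)) { // validate key is the proper format
    http_response_code(400);
    exit;
}

// get the S3 object key from a database based on $key
$bucket    = 'my-private-bucket';   // made-up bucket name
$objectKey = lookUpObjectKey($key); // hypothetical helper backed by your database

// connect to S3 securely with an IAM profile that can read the private files
$s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);
$result = $s3->getObject(['Bucket' => $bucket, 'Key' => $objectKey]);

// output the file
header('Content-Type: ' . $result['ContentType']);
echo $result['Body'];
?>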
As far as I know, you could try to make your S3 bucket a "web server" like this, but then you would probably have to "make the files public". If you have some kind of logic to restrict access, you could create a bucket policy.
Currently I'm creating an app using Intel XDK to upload images from devices to my server.
The problem I'm currently encountering is how to code my backend so that it can receive the uploaded file from the mobile device.
In PHP, I only know that the file upload requires:
<input type="file" name="file" />
then use $_FILES["file"] to save it into storage.
It's almost the same in .NET as well.
But I still can't work out how to receive the file once it is uploaded from the mobile app.
It would be great if someone could share or advise on the missing part (.NET and PHP).
On the ASP.NET server side, use a web service that receives the file as a byte array and saves it however you want.
Code sample reference link: http://www.codeproject.com/Articles/22985/Upload-Any-File-Type-through-a-Web-Service
[WebMethod]
public string UploadFile(byte[] f, string fileName)
{
    // the byte array argument contains the content of the file
    // the string argument contains the name and extension
    // of the file passed in the byte array
    try
    {
        // instance a memory stream and pass the
        // byte array to its constructor
        MemoryStream ms = new MemoryStream(f);
        // instance a filestream pointing to the
        // storage folder, use the original file name
        // to name the resulting file
        FileStream fs = new FileStream
            (System.Web.Hosting.HostingEnvironment.MapPath
            ("~/TransientStorage/") +
            fileName, FileMode.Create);
        // write the memory stream containing the original
        // file as a byte array to the filestream
        ms.WriteTo(fs);
        // clean up
        ms.Close();
        fs.Close();
        fs.Dispose();
        // return OK if we made it this far
        return "OK";
    }
    catch (Exception ex)
    {
        // return the error message if the operation fails
        return ex.Message.ToString();
    }
}
For more information about uploading files to a server: https://software.intel.com/en-us/node/493213
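For the PHP side mentioned in the question, a minimal receiving script might look roughly like this; it's just a sketch and assumes the mobile app POSTs a multipart/form-data field named "file", matching the input shown above:

<?php
// receive_upload.php - minimal sketch of the PHP receiving side
if (isset($_FILES['file']) && $_FILES['file']['error'] === UPLOAD_ERR_OK) {
    $targetDir  = __DIR__ . '/uploads/'; // hypothetical storage folder
    $targetPath = $targetDir . basename($_FILES['file']['name']);
    if (move_uploaded_file($_FILES['file']['tmp_name'], $targetPath)) {
        echo "OK";
    } else {
        echo "Failed to move uploaded file";
    }
} else {
    echo "No file received";
}
?>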
If you are aware of ASP.NET Web API 2, have a look at this sample:
http://aspnet.codeplex.com/sourcecontrol/latest#Samples/WebApi/FileUploadSample/
Also, check these ones:
http://damienbod.wordpress.com/2014/03/28/web-api-file-upload-single-or-multiple-files/
http://www.asp.net/web-api/overview/working-with-http/sending-html-form-data,-part-2
http://www.c-sharpcorner.com/UploadFile/2b481f/uploading-a-file-in-Asp-Net-web-api/
Check these SO links:
How To Accept a File POST
File upload Jquery WebApi
I think, with the above links, you will surely be able to create a service that takes in a file uploaded from a form.
Hope it helps you...
All the best...
I am facing the task of having to upload a snapshot to the server, but I don't want the user to have to download the image to their computer first.
I have explored a few solutions for generating an image server-side with PHP, but they all seem to use a method where the server sends the image back to the user.
See for instance: http://mattkenefick.com/blog/2008/11/06/saving-jpegs-with-flash/
I'm wondering if it's possible to save $GLOBALS["HTTP_RAW_POST_DATA"], which in that example contains the ByteArray sent by Flash, to the server as an image file....
Use PHP code along these lines to save the contents of $GLOBALS["HTTP_RAW_POST_DATA"]:
// untested code
$imageBytes = $GLOBALS["HTTP_RAW_POST_DATA"];
// in real code you had better create a new file for every upload :-)
$file = fopen("uploads/test.jpg", "wb"); // "b" keeps the write binary-safe
if (!fwrite($file, $imageBytes)) {
    return "Error writing to file";
}
fclose($file);
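Note that $GLOBALS["HTTP_RAW_POST_DATA"] was removed in PHP 7, so on current PHP versions the same raw bytes are read from the php://input stream instead:

$imageBytes = file_get_contents("php://input");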