Facebook Marketing API: Upload Image to library provided a URL - php

I am new to the Facebook Marketing API, and what I am working on is uploading an image to my Facebook Marketing library.
Right now I can successfully upload an image to Facebook Marketing, but unfortunately I cannot see whether there is a way to upload an image from a URL.
Has anyone had any previous experience with that?
Code sample:
Api::init(getenv('FACEBOOK_APP_ID'), getenv('FACEBOOK_APP_SECRET'), getenv('FACEBOOK_APP_TOKEN'));
/**
* {@inheritdoc}
*/
public function testFunc()
{
$adAccountId = getenv('FACEBOOK_APP_ACCOUNT_ID');
$account = new AdAccount();
$account->{AdAccountFields::ID} = $adAccountId;
$image = new AdImage(null, "act_{$account->id}");
$image->{AdImageFields::FILENAME} = getenv('FACEBOOK_APP_MARKETING_PATH').'fbTestImage.png';
$image->create();
$this->line('Image Hash: '.$image->{AdImageFields::HASH}.PHP_EOL);
}

You have to use PHP to pull the image from the URL, write it to a temporary file, upload that, and then delete the temporary file when complete (not ideal, but it works). With the Node.js Marketing API you can supply the content rather than the filename, which is nice.
However, here is a possible solution for now:
$url = "https://www.sample/image.png";
// Create output file name
$arr = explode("/", $url);
$img_file = __DIR__ . '/tmp_' . $arr[count($arr) - 1];
// Load File contents from URL
$data = file_get_contents($url);
// Write Temporary File
$fp = fopen($img_file,"w");
fwrite($fp,$data);
fclose($fp);
// Then Push the temp file to Facebook
// Rest of FB Code...
$image->{AdImageFields::FILENAME} = $img_file;
// Wait for the Upload to complete then delete the temp file
unlink($img_file);
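If it helps, here is the same idea rolled into one small helper. This is only a sketch: the function name is made up, Api::init() is assumed to have been called as in your snippet, and error handling is omitted.

function uploadAdImageFromUrl($url, $adAccountId)
{
    // Download the remote image into a temporary file.
    $tmpFile = tempnam(sys_get_temp_dir(), 'fbimg_');
    file_put_contents($tmpFile, file_get_contents($url));

    // Upload the temporary file to the ad account's image library.
    $image = new AdImage(null, "act_{$adAccountId}");
    $image->{AdImageFields::FILENAME} = $tmpFile;
    $image->create();

    // create() returns after the request completes, so the temp file can be removed here.
    unlink($tmpFile);

    return $image->{AdImageFields::HASH};
}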

Related

How to track the uploaded size of a file in a Laravel 8 API

I am developing a file management system using Laravel 8.
I created a function that accepts a file:
public function uploadExperiment(Request $request)
{
$file = $request->file('file');
$uniqueId = time();
$fileName = $file->getClientOriginalName();
$filePath = $uniqueId . '/' . $fileName;
Storage::disk('local')->put($filePath, file_get_contents($file));
return response()->json(["success"=>true]);
}
I need to track the uploaded size during the upload process, and I need to save this progress in a database.
For example, if I upload a 2 GB file it will take some time, so any idea how to track it?
Maybe you want to create something like this:
https://www.itsolutionstuff.com/post/php-laravel-file-upload-with-progress-bar-exampleexample.html
To get the size of the uploaded file in Laravel, use:
$size = $request->file('file')->getSize();
To show the percentage of your upload you can use jQuery on the client side; doing it purely in PHP is harder. The example below should give you an idea of the approach:
function timeout_trigger() {
var loaded = file.getLoaded(); // bytes uploaded so far (getLoaded() is a placeholder; use whatever your uploader exposes)
var p = parseInt(loaded / size * 100); // size is the total file size in bytes
$(".progress").css("max-width",p+"%");
$(".progress-view").text(p+"%");
if(p!=100) {
setTimeout(timeout_trigger, 20);
}
}
timeout_trigger();
I assume you have one form that you submit via POST; use this function for that.
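If you want the progress tracked on the server and saved in the database, another option is to send the file in chunks and record how much has arrived after each chunk. This is only a sketch: the upload_progress table, the upload_id and total_bytes request fields, and the route are assumptions, not part of your code.

// At the top of the controller file:
use Illuminate\Http\Request;
use Illuminate\Support\Facades\DB;

// Hypothetical controller action, e.g. routed as POST /upload-chunk:
public function uploadChunk(Request $request)
{
    $uploadId = $request->input('upload_id');          // identifier generated by the client
    $total    = (int) $request->input('total_bytes');  // full file size reported by the client
    $chunk    = $request->file('chunk');

    // Append this chunk to a partial file on the local disk.
    $target = storage_path('app/' . $uploadId . '.part');
    file_put_contents($target, file_get_contents($chunk->getRealPath()), FILE_APPEND);

    // Record how many bytes have arrived so far.
    clearstatcache(true, $target);
    $received = filesize($target);
    DB::table('upload_progress')->updateOrInsert(
        ['upload_id' => $uploadId],
        ['received_bytes' => $received, 'total_bytes' => $total]
    );

    return response()->json([
        'received' => $received,
        'percent'  => $total > 0 ? round($received / $total * 100) : 0,
    ]);
}

The client then only has to POST the chunks in order and read the percentage back from the response (or from the upload_progress row).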

How to transfer single file to Google Cloud Storage Through Script

So I'll preface this by stating that my boss is asking me to set up a Google Cloud Storage backup process without my having the Google bucket instance or the .json key file for Google Cloud Platform. Basically, I'm writing a script to batch the process on our Linux and Windows Google Cloud Platform instances: take the backup file that is stored daily, send it to Google Cloud Storage, then delete the file from the system (so that the backup file doesn't take up space on the instance). Since all of this is relatively new to me, I was hoping to get some advice on whether the key (.json private key) is needed for this process and how to include it; obviously I don't have the key yet, but I want to add a placeholder until we set that up.
I found Google's documentation pretty helpful but a tad overwhelming. Here is the link to what I have used for reference: https://cloud.google.com/storage/docs/uploading-objects#storage-upload-object-code-sample
Here is my code:
<?php
require_once 'vendor/autoload.php';
use Google\Cloud\Storage\StorageClient;
// Set up storage client with keyfilepath or .json key necessary?
// Set the parameters for Google cloud bucket name, filename, and path to file.
$bucketName = "fishbowl-storage";
$fileNameA = "backup_filename";
$fileNameB = "quickbook_filename";
$dataBackupSource = "";
$dataQBBackupSource = "";
// Set the function that will upload the file to google Cloud Storage.
function upload_object($bucketName, $fileName, $source)
{
// Create the storage client and store the file in the cloud.
$storage = new StorageClient();
$file = fopen($source, 'r');
$bucket = $storage->bucket($bucketName);
$file = $bucket->upload($file, [
'name' => $fileName
]);
// Print the success message to the user.
printf('Uploaded %s to gs://%s/%s' . PHP_EOL, basename($source), $bucketName, $fileName);
}
// Check the OS to determine whether Linux or Windows.
if(php_uname('s') == "Linux"){
// Set the file path to grab back up.
$dataBackupSource = "../Fishbowl/backup";
$dataQBBackupSource = "../Fishbowl/otherbackup";
// Use try/catch statement to catch issues with private key.
try{
// run the function to place both backup files in the Google Cloud Storage Bucket instance.
upload_object($bucketName, $fileNameA, $dataBackupSource);
upload_object($bucketName, $fileNameB, $dataQBBackupSource);
// Delete the old file stored in the path.
if(!unlink($dataBackupSource)){
print ("The file cannot be deleted or isn't present in this directory...");
}
else{
print ("The file " . $dataBackupSource . " has been deleted.");
}
}
catch (Exception $e){
print "This process failed for the following reason: " . $e;
return false;
}
}
else if(php_uname('s') == "Windows NT"){
// Set the file path to grab back up.
$dataBackupSource = "../Fishbowl/backup";
$dataQBBackupSource = "../Fishbowl/otherbackup";
// Use try/catch statement to catch issues with private key.
try{
// run the function to place both backup files in the Google Cloud Storage Bucket instance.
upload_object($bucketName, $fileNameA, $dataBackupSource);
upload_object($bucketName, $fileNameB, $dataQBBackupSource);
// Delete the old file stored in the path.
if(!unlink($dataBackupSource)){
print ("The file cannot be deleted or isn't present in this directory...");
}
else{
print ("The file " . $dataBackupSource . " has been deleted.");
}
}
catch (Exception $e){
print "This process failed for the following reason: " . $e;
return false;
}
}
else{
print "The operating system has another name...";
}
?>
TL;DR: Do I need to include the JSON key in my script to pass a file to Google Cloud Storage and, if so, where do I put it?
I think this is the documentation you are looking for. Basically, you need to use the Cloud Client Libraries for the Google Cloud Storage API. These are the steps you need to follow:
1. Install the client library:
composer require google/cloud-storage
2. Set up authentication. In this step you will create the service account and the key.
After setting the GOOGLE_APPLICATION_CREDENTIALS environment variable, you don't need to explicitly specify your credentials in code when using a Google Cloud client library, as is mentioned here.
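If you would rather not rely on the environment variable (for example while the key file is still being set up), the PHP client also accepts the path to the key explicitly; this is just a sketch and the path is a placeholder:

use Google\Cloud\Storage\StorageClient;

// Option A: export GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json before running the script.
// Option B: pass the key file path directly when creating the client.
$storage = new StorageClient([
    'keyFilePath' => '/path/to/service-account-key.json',
]);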
3. Use the client library. In this case you want to upload a file:
use Google\Cloud\Storage\StorageClient;
/**
* Upload a file.
*
* @param string $bucketName the name of your Google Cloud bucket.
* @param string $objectName the name of the object.
* @param string $source the path to the file to upload.
*
* @return Psr\Http\Message\StreamInterface
*/
function upload_object($bucketName, $objectName, $source)
{
$storage = new StorageClient();
$file = fopen($source, 'r');
$bucket = $storage->bucket($bucketName);
$object = $bucket->upload($file, [
'name' => $objectName
]);
printf('Uploaded %s to gs://%s/%s' . PHP_EOL, basename($source),
$bucketName, $objectName);
}

Dropbox account as Wordpress uploads folder

As I'm developing a site with the potential of holding a huge number of images, I'm wondering if I can point the standard WordPress uploads folder to a Dropbox account.
And if so, how do I do that?
It would be great if I could implement it in a manner where WordPress doesn't even know it is a 'remote' folder; media upload should work the same way as in a native WordPress setup.
I've read about the possibility of using another folder instead of wp-content/uploads, but I could not find any info about using Dropbox for this.
Yes, you can do it. As long as you keep the same structure on Dropbox and save the Dropbox shareable links as metadata for both the original file and the generated sizes, a simple yet fully working setup would be something like the following, using thephpleague/flysystem with their Dropbox adapter:
Step 1
Add a file called composer.json to the root of your theme with this content:
{
"require": {
"league/flysystem": "^1",
"league/flysystem-dropbox": "^1"
}
}
Step 2
Install Composer by following these instructions
Step 3
Using the command line on your terminal/console, go to your theme directory and run:
composer install -o
Step 4
Create Dropbox App here.
I suggest that you select "App folder" as the type of access.
A directory matching the name of your app will be created in the "Apps" directory at the root of your Dropbox account; this will be your "uploads" directory on Dropbox.
Step 5
Go to your app admin page and generate a new access token.
Save the access token somewhere and also copy the "App secret"
Step 6
Add the following to your functions.php:
use League\Flysystem\AdapterInterface;
use League\Flysystem\Adapter\Local as LocalAdapter;
use League\Flysystem\Dropbox\DropboxAdapter;
use League\Flysystem\Filesystem;
use League\Flysystem\MountManager;
use Dropbox\Client as DropboxClient;
// Autoload vendors
require_once __DIR__ .'/vendor/autoload.php';
/**
* Class that will handle uploading to Dropbox
*
*/
class SO40950172Filesystem {
/**
* Contains several mounted filesystems
*
* @var League\Flysystem\MountManager object
*/
protected $filesystem;
/**
* Contains Dropbox client
*
* We need this accessible to create shareable links
*
* @var Dropbox\Client object
*/
protected $dropbox_client;
/**
* Instantiates this class
*/
public function __construct() {
// Get WordPress uploads directory info
$uploads_info = wp_upload_dir();
// Create Local filesystem
$local_adapter = new LocalAdapter($uploads_info['basedir']);
$local_fs = new Filesystem($local_adapter, [
'visibility' => AdapterInterface::VISIBILITY_PUBLIC
]);
// Create Dropbox filesystem
$this->dropbox_client = new DropboxClient($app_access_token, $app_secret);
$dropbox_adapter = new DropboxAdapter($this->dropbox_client, '/');
$dropbox_fs = new Filesystem($dropbox_adapter, [
'visibility' => AdapterInterface::VISIBILITY_PUBLIC
]);
// Set filesystem manager
$this->filesystem = new MountManager([
'local' => $local_fs,
'dropbox' => $dropbox_fs
]);
}
/**
* Uploads file to Dropbox
*
* @param string $path Path to file
* @return object Current object
*/
public function uploadToDropbox($path)
{
// Normalize path
$path = static::getNormalizedPath($path);
// Get content from the local file
$content = $this->read("local://$path");
// Push file to Dropbox
$this->put("dropbox://$path", $content);
return $this;
}
/**
* Deletes file from Dropbox
*
* @param string $path Path to file
* @return object Current object
*/
public function deleteFromDropbox($path)
{
// Normalize path
$path = static::getNormalizedPath($path);
// Delete file from Dropbox
$this->delete("dropbox://$path");
return $this;
}
/**
* Returns the unique identifier path section of a Dropbox URL
*
* @param string $path Path to file
* @return string Dropbox URL unique identifier
*/
public function getDropboxUrlID($path)
{
// Normalize path
$path = static::getNormalizedPath($path);
// Get unique link
$url = $this->dropbox_client->createShareableLink("/$path");
// Parse URL to retrieve its path
$url_info = parse_url($url);
$url_path = $url_info['path'];
// Remove "s/" section and file name from the URL path
$id = str_replace(['s/', basename($path)], '', $url_path);
// Return Dropbox unique identifier for this file URL
return trim($id, '/');
}
/**
* Returns clean & relative paths
*
* @param string $path Raw path
* @return string Parsed path
*/
public static function getNormalizedPath($path)
{
// Get WordPress uploads directory info
$uploads_info = wp_upload_dir();
// Remove uploads base path so that we end up
// with the "/YYYY/MM/filename.extension" format
$path = str_replace($uploads_info['basedir'], '', $path);
// Remove uploads base url so that we end up
// with the "/YYYY/MM/filename.extension" format
$path = str_replace($uploads_info['baseurl'], '', $path);
// Remove forward slashes on both ends
$path = trim($path, '/');
// Return path
return $path;
}
/**
* Making sure all calls go to $this->filesystem
*
* @param string $name Method name
* @param array $args Method arguments
* @return mixed
*/
public function __call($name, array $args)
{
if (!method_exists($this->filesystem, $name))
throw new \Exception("\League\Flysystem\MountManager doesn't have \"$name\" method");
return call_user_func_array([$this->filesystem, $name], $args);
}
}
// Manipulate media URLs sitewide
add_filter('wp_get_attachment_url', 'so_40950172_get_dropbox_url', 9, 2);
function so_40950172_get_dropbox_url($absolute_url, $post_id) {
// Get normalized path
$path = SO40950172Filesystem::getNormalizedPath($absolute_url);
// Get only the filename
$path = basename($path);
// Get Dropbox URL unique ID
$id = get_post_meta($post_id, 'dropbox_id_'. $path, true);
// Return absolute URL
return $id ? "https://dl.dropboxusercontent.com/s/$id/$path/?dl=0" : $path;
}
// Upload new and updated files to Dropbox
add_filter('wp_update_attachment_metadata', 'so_40950172_upload_to_dropbox', 9, 2);
function so_40950172_upload_to_dropbox($data, $post_id) {
// Get filesystem
$fs = new SO40950172Filesystem();
// Upload original file to Dropbox
$fs->uploadToDropbox($data['file']);
// Add Dropbox URL unique ID to meta data
add_post_meta($post_id, 'dropbox_id_'. basename($data['file']), $fs->getDropboxUrlID($data['file']));
// Upload intermediate image sizes
if (isset($data['sizes']) && $data['sizes']) {
// Get year and month prefix (e.g /YYYY/MM) from original file
$base_path = dirname($data['file']);
// Loop through all sizes
foreach ($data['sizes'] as $size_name => $size_data) {
// Set path for current size
$size_path = $base_path .'/'. $size_data['file'];
// Upload size to Dropbox
$fs->uploadToDropbox($size_path);
// Add Dropbox URL unique ID to meta data
add_post_meta($post_id, 'dropbox_id_'. basename($size_path), $fs->getDropboxUrlID($size_path));
}
}
return $data;
}
// Delete Dropbox file
add_filter('wp_delete_file', 'so_40950172_delete_dropbox_file');
function so_40950172_delete_dropbox_file($absolute_path) {
// Get filesystem
$fs = new SO40950172Filesystem();
// Delete file from Dropbox
$fs->deleteFromDropbox($absolute_path);
}
Step 7
In the code you just pasted into functions.php (one way to define these values is shown in the sketch after this list):
Replace $app_access_token with the Dropbox app access token you generated
Replace $app_secret with the Dropbox app secret
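For example, a minimal sketch (the constant names are made up) that keeps the credentials out of the theme by defining them in wp-config.php:

// In wp-config.php (constant names are arbitrary):
define('DROPBOX_ACCESS_TOKEN', 'your-generated-access-token');
define('DROPBOX_APP_SECRET', 'your-app-secret');

// Then, inside SO40950172Filesystem::__construct(), replace the placeholders with:
$this->dropbox_client = new DropboxClient(DROPBOX_ACCESS_TOKEN, DROPBOX_APP_SECRET);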
NOTES
The original file and generated sizes will also be saved locally, but you don't need to worry about them. You can even delete the local file after confirmation of a successful upload, if you want and/or care about disk space.
I also tested the built-in image editor and it worked without any issues.
If you ever need to move things around on the Dropbox side, all you need to do is update the functions above, since no path info is saved in the database (which is just fine).
Apparently you can mirror the WordPress structure on Dropbox, but you can't simply link to the files using a base URL plus the WordPress uploads structure; you really need to get the shareable link for each original file and generated size and store something about them as metadata. In the code above I chose to store only the unique part of the URL as metadata, which really is the only unique thing about them.
I know this is off-topic, but I would recommend either AWS S3 or Google Cloud Storage because you can access your files with the exact same file structure you have on Dropbox. No need to save anything as meta data.
That's an interesting idea, but WordPress uses DB relations to manage the uploaded images, so I think you must handle the images through the media uploader. TL;DR: you can't.

Why after I uploaded an image to Azure I cannot access it?

I'm developing a Universal App for Windows Phone 8.1, and I'm using a PHP page to recognize some patterns in an image that I upload to my service.
I have discovered that after I upload an image to Azure, I cannot use it. I'm using WebMatrix to develop my PHP page, and when I refresh it, it doesn't show me the images that I uploaded; however, when I try to publish and select the option "Delete files on the remote server that are not on my computer", I can see my images. This is an example of my PHP code:
$uploaddir = getcwd();
$uploadfile = $uploaddir . "/" . basename($_FILES['userfile']['name']);
if (move_uploaded_file($_FILES['userfile']['tmp_name'], $uploadfile)) {
chmod($uploadfile, 0755);
$Test = new Display($_FILES['userfile']['name']);
echo '{"result": "' . $Test->getNumber($_REQUEST['color'], false) . '"}';
//unlink($uploadfile);
} else {
echo '{"result": "-1"}';
}
I'd like to know what my bug could be, because I don't understand why I can reach the page from its URL yet cannot use the uploaded image. Maybe it's how I assigned the permissions, but with or without the chmod nothing changes. I have even tried other hosts and the problem is the same: when I enter the File Manager there are only my PHP files, and it doesn't allow me to manage the image.
This is my Windows Phone code to upload the Image if it's necessary:
byte[] ConvertBitmapToByteArray()
{
WriteableBitmap bmp = bitmap;
using (Stream stream = bmp.PixelBuffer.AsStream())
{
MemoryStream memoryStream = new MemoryStream();
stream.CopyTo(memoryStream);
return memoryStream.ToArray();
}
}
public async Task<string> Upload()
{
try
{
using (var client = new HttpClient())
{
using (var content =
new MultipartFormDataContent())
{
byte[] data = ConvertBitmapToByteArray();
using (var stream = new InMemoryRandomAccessStream())
{
// encoder *outputs* to stream
var encoder = await BitmapEncoder.CreateAsync(BitmapEncoder.BmpEncoderId, stream);
// encoder's input is the bitmap's pixel data
encoder.SetPixelData(BitmapPixelFormat.Bgra8, BitmapAlphaMode.Straight,
(uint)bitmap.PixelWidth, (uint)bitmap.PixelHeight, 96, 96, data);
await encoder.FlushAsync();
content.Add(new StreamContent(stream.AsStream()), "userfile", fileNewImage);
using (
var message =
await client.PostAsync("http://xplace.com/uploadtest.php", content))
{
var input = await message.Content.ReadAsStringAsync();
return input;
}
}
}
}
}
catch (Exception ex)
{
return null;
}
}
Thanks for your worthy knowledge and experience.
Create a blob storage account, and add a public container. In your action to save the file, store the file in your blob storage container. Then you can access the image as you would any other image.
Here is a tutorial on Azure: http://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-blobs/
Also, you cannot create folders in a container, but you could use a naming convention in the blob name to create the idea of a folder structure. And you can attach a domain to the cloud service if you want the URL to have a certain look.
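On the PHP side, a rough sketch of the upload handler using the microsoft/azure-storage-blob Composer package could look like this; the connection string and the "images" container name are placeholders, and the package choice is an assumption about your setup:

require_once 'vendor/autoload.php';
use MicrosoftAzure\Storage\Blob\BlobRestProxy;

// Placeholder connection string; copy the real one from the storage account's "Access keys" page.
$connectionString = "DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey";
$blobClient = BlobRestProxy::createBlobService($connectionString);

// Stream the uploaded file into a blob inside a public container named "images".
$content = fopen($_FILES['userfile']['tmp_name'], "r");
$blobClient->createBlockBlob("images", $_FILES['userfile']['name'], $content);

// The image is then reachable at https://youraccount.blob.core.windows.net/images/<filename>.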
Reading your question again, it looks like the issue is more on the client side.
Here is what I usually do to attach a file to a MultipartFormDataContent:
MultipartFormDataContent content = new MultipartFormDataContent();
FileInfo info = new FileInfo(currFileLoc);
string contentMediaType = null;
//This is a Dictionary<string, string> that takes in the file
//extension, and returns the media type (e.g. "image/jpeg")
GlobalVariables.ApprovedMediaTypes.TryGetValue(
info.Extension.ToLower()
, out contentMediaType);
//If the dictionary doesn't return a result
//then it's not a supported file type
if (contentMediaType == null)
throw new Exception(
String.Format("The file \"{0}\" is an unsupported file type."
, info.Name));
ByteArrayContent currFile = new ByteArrayContent(File.ReadAllBytes(currFileLoc));
currFile.Headers.ContentType = new MediaTypeWithQualityHeaderValue(contentMediaType);
content.Add(currFile, currFileLoc, currFileLoc);
Then I make my call. Maybe you found another option with blob storage. Finally, if you upload large files, you may want to look into uploading in chunks.

Google drive file upload and share it to public access

I am uploading files to a Google Drive account; what I want to do is make them readable, but not editable, by the public. How can I do that?
$file = new Google_Service_Drive_DriveFile();
$file->title = $_POST['name'];
$file->shared = "public"; //i need to do this
$file->folder = "/somefolder/"; // and this
[optional]
It's actually a 3-in-1 question: I also want to place that file into a specific folder on the drive, and I want to know how to create that folder on Google Drive.
Might be useful for newbies:
<?php
$client = new Google_Client();
// SET Client ID, Client Secret, Tokens as per your mechanism
?>
Step 1:
Create the folder. The API will return the ID for the folder created.
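A rough sketch of this step (the folder name is a placeholder):
<?php
// Create a folder and keep the ID that the API returns.
$folder = new Google_Service_Drive_DriveFile();
$folder->setName("somefolder");
$folder->setMimeType("application/vnd.google-apps.folder");
$service = new Google_Service_Drive($client);
$created = $service->files->create($folder, ["fields" => "id"]);
$folderId = $created->id; // use this ID as the parent when uploading into the folder
?>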
Step 2:
Upload the file into the folder with setParents():
Example:
<?php
// Upload the file to efgh folder which is inside abcd folder which is in the root folder
$file = new Google_Service_Drive_DriveFile();
$file->setParents(["efgh456"]); // efgh456 is the ID returned by the API for the "efgh" folder; only the direct parent is needed, even when it is nested inside another folder
$service = new Google_Service_Drive($client);
$createdFile = $service->files->create($file, ["fields" => "id"]);
$fileID = $createdFile->id; // file ID returned by Drive
?>
Step 3:
Setting the permissions to shareable:
<?php
$permissionService = new Google_Service_Drive_Permission();
$permissionService->role = "reader";
$permissionService->type = "anyone"; // anyone with the link can view the file
$service->permissions->create($fileID, $permissionService);
?>
Reference: https://developers.google.com/drive/api/v3/reference/permissions/create
