What I have so far
Right now I have working OAuth2 authentication between a Laravel user and the Dropbox API. Every user can upload files to their personal folder.
The Problem
After uploading a file from Laravel via the Dropbox API v2, I can see that an empty (0 bytes) file was uploaded.
Used to accomplish this task:
Laravel
Guzzle
Dropbox API Library
What am I missing?
The Code
My function for processing a form looks like this:
// Grab the uploaded file from the form
$formFile = $request->file('fileToUpload');
$path = $formFile->getClientOriginalName(); // original file name, used as the Dropbox path
$file = $formFile->getPathName();           // temporary path of the upload on the local disk
$result = Dropbox::files()->upload($path, $file);
return redirect('dropboxfiles');
And my files->upload function in my Dropbox Library looks like this:
$client = new Client;
$response = $client->post("https://content.dropboxapi.com/2/files/upload", [
'headers' => [
'Authorization' => 'Bearer '.$this->getAccessToken(),
'Content-Type' => 'application/octet-stream',
'Dropbox-API-Arg' => json_encode([
'path' => $path,
'mode' => 'add',
'autorename' => true,
'mute' => true,
'strict_conflict' => false
])
],
'data-binary' => '#'.$file
]);
As I said, the file gets uploaded with the correct name, but it is 0 bytes, i.e. an empty file.
Thank you so much in advance for your help!
Update
With the following code I made it work. My question, though, is whether there is a better, more "Laravel-like" solution than using fopen?
$response = $client->post("https://content.dropboxapi.com/2/files/upload", [
'headers' => [
'Authorization' => 'Bearer '.$this->getAccessToken(),
'Dropbox-API-Arg' => json_encode([
'path' => $path,
'mode' => 'add',
'autorename' => true,
'mute' => true,
'strict_conflict' => false
]),
'Content-Type' => 'application/octet-stream',
],
'body' => fopen($file, "r"),
]);
As @Greg mentioned (see the cross-linked reference), I was able to solve this issue by using
'body' => fopen($file, "r"),
instead of
'data-binary' => '#'.$file
This is, as Greg mentioned, because data-binary is a curl option. Other HTTP clients, like Guzzle in my case, use different option names.
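For reference, if you are on a newer Laravel version whose HTTP client supports withBody, a more "Laravel-like" version of the upload might look roughly like the sketch below (untested against Dropbox on my side; note that file_get_contents reads the whole file into memory, whereas fopen streams it):
use Illuminate\Support\Facades\Http;

$response = Http::withToken($this->getAccessToken())
    ->withHeaders([
        'Dropbox-API-Arg' => json_encode([
            'path' => $path,
            'mode' => 'add',
            'autorename' => true,
            'mute' => true,
            'strict_conflict' => false,
        ]),
    ])
    // withBody sets the raw request body and its Content-Type in one call
    ->withBody(file_get_contents($file), 'application/octet-stream')
    ->post('https://content.dropboxapi.com/2/files/upload');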
Related
I have a Postman request for testing a third-party API.
What I am trying to do is convert it into code using Laravel's HTTP client; the code I currently have is:
public function uploadToThridParty()
{
$uploadContents = [
'id' => 'this-is-my-id',
'fileUpload' => true,
'frontfile' => Storage::get('somefrontfile.jpg'),
'sideview' => Storage::get('itsasideview.png'),
];
$request = Http::withHeaders(
[
'Accept' => 'application/json',
]
);
$response = $request
->asForm()
->post(
'https://urltoupload.com/upload', $uploadContents
    );
}
But every time I run this, the third-party API comes back with Invalid ID, even though if I use Postman with the same ID it works fine.
I can't seem to figure out where I am going wrong with my code.
As @Cbroe mentioned about attaching the files before sending the POST request, you can do it like in this example:
public function uploadToThridParty()
{
$uploadContents = [
'id' => 'this-is-my-id',
'fileUpload' => true
];
$request = Http::withHeaders(
[
'Accept' => 'application/json',
]
);
$response = $request
->attach(
'frontfile', file_get_contents(storage_path('somefrontfile.jpg')), 'somefrontfile.jpg'
)
->attach(
'sideview', file_get_contents(storage_path('itsasideview.png')), 'itsasideview.png'
)
->post(
'https://urltoupload.com/upload', $uploadContents
    );
}
Also, I think you need to remove the asForm method, because it overrides your Content-Type header to application/x-www-form-urlencoded; that is why your exception is Invalid ID.
Some third-party APIs require the request to have a content type of multipart/form-data.
You can double-check all the headers being passed in your Postman request's Headers tab, including the hidden headers.
If you do need your request to be multipart/form-data, you can use Guzzle's multipart option.
Although this doesn't seem to be in the Laravel HTTP client docs, you can simply call the asMultipart() method on your HTTP request;
just check /vendor/laravel/framework/src/Illuminate/Support/Facades/Http.php for a full reference of the HTTP client.
You can have your request like this.
public function uploadToThridParty() {
$uploadContents = [
[
'name' => 'id',
'contents' => 'this-is-my-id'
],
[
'name' => 'fileUpload',
'contents' => true
],
[
'name' => 'frontfile',
'contents' => fopen( Storage::path( 'somefrontfile.jpg' ), 'r')
],
[
'name' => 'sideview',
'contents' => fopen( Storage::path( 'itsasideview.png' ), 'r')
],
];
$request = Http::withHeaders(['Accept' => 'application/json']);
$response = $request->asMultipart()->post('https://urltoupload.com/upload', $uploadContents );
}
I've obtained the following code from searching on the topic:
Route::get('/test', function () {
//disable execution time limit when downloading a big file.
set_time_limit(0);
$fs = Storage::disk('local');
$path = 'uploads/user-1/1653600850867.mp3';
$stream = $fs->readStream($path);
if (ob_get_level()) ob_end_clean();
return response()->stream(function () use ($stream) {
fpassthru($stream);
},
200,
[
'Accept-Ranges' => 'bytes',
'Content-Length' => 14098560,
'Content-Type' => 'application/octet-stream',
]);
});
However, when I click play in the UI, it takes a good four seconds to start playing. If I switch the disk to local, though, it plays almost instantly.
Is there a way to improve the performance, or to read the stream by range per request?
Edit
My current DigitalOcean (DO) disk config is as below:
'driver' => 's3',
'key' => env('DO_ACCESS_KEY_ID'),
'secret' => env('DO_SECRET_ACCESS_KEY'),
'region' => env('DO_DEFAULT_REGION'),
'bucket' => env('DO_BUCKET'),
'url' => env('DO_URL'),
'endpoint' => env('DO_ENDPOINT'),
'use_path_style_endpoint' => env('DO_USE_PATH_STYLE_ENDPOINT', false),
But I find two types of integration online, one specifying the CDN endpoint and one that doesn't. I am not sure which one is relevant, though the one that specifies the CDN is for Laravel 8 and I am on Laravel 9.
I had to change my code such that:
I had to use the PHP SDK client for connecting to AWS, because the Laravel API isn't flexible enough to allow passing additional arguments (at least I haven't found anything while researching).
I changed to streamDownload, as I can't see any description of the stream method in the docs even though it is present in the code.
So the code below achieves what I was aiming for, which is downloading by chunk based on the range received in the request.
return response()->streamDownload(function(){
$client = new \Aws\S3\S3Client([
'version' => 'latest',
'region' => config('filesystems.disks.do.region'),
'endpoint' => config('filesystems.disks.do.endpoint'),
'credentials' => [
'key' => config('filesystems.disks.do.key'),
'secret' => config('filesystems.disks.do.secret'),
],
]);
$path = 'uploads/user-1/1653600850867.mp3';
$range = request()->header('Range');
$result = $client->getObject([
'Bucket' => 'wyxos-streaming',
'Key' => $path,
'Range' => $range
]);
echo $result['Body'];
},
200,
[
'Accept-Ranges' => 'bytes',
'Content-Length' => 14098560,
'Content-Type' => 'application/octet-stream',
]);
Note:
In a live scenario you would need to cater for the case where no range is specified; the content length then needs to be the actual file size.
When a range is present, however, the content length should be the size of the segment being echoed (roughly as sketched below).
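Here is a rough sketch of that range bookkeeping; the helper below is purely illustrative, and it assumes the full object size is already known (e.g. from a prior HEAD request or the database):
// Purely illustrative: pick the status code and headers depending on
// whether the client sent a Range header. $fullSize is assumed known.
function rangeHeaders(?string $range, int $fullSize): array
{
    if ($range === null) {
        // No Range header: plain 200 with the full file size
        return [200, [
            'Accept-Ranges'  => 'bytes',
            'Content-Length' => $fullSize,
            'Content-Type'   => 'application/octet-stream',
        ]];
    }
    // A Range header looks like "bytes=0-1048575" (the end may be omitted)
    [, $spec] = explode('=', $range, 2);
    [$start, $end] = array_pad(explode('-', $spec, 2), 2, '');
    $start = (int) $start;
    $end = $end === '' ? $fullSize - 1 : (int) $end;
    return [206, [
        'Accept-Ranges'  => 'bytes',
        'Content-Length' => $end - $start + 1,                   // size of the segment only
        'Content-Range'  => "bytes {$start}-{$end}/{$fullSize}", // where the segment sits in the file
        'Content-Type'   => 'application/octet-stream',
    ]];
}
The returned status code and headers would then be passed as the second and third arguments to streamDownload, with the same $range forwarded to getObject as in the code above.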
I am uploading a file to my API built on Laravel. In my code below, the data is sent to the API. But how can I get the file that was uploaded on the API side?
Controller
$file = $request->file('imported-file');
$name = $file->getClientOriginalName();
$path ='/Users/desktop/Folder/laravelapp/public/Voice/';
$client = new \GuzzleHttp\Client();
$fileinfo = array(
'message' => 'Testing content',
'recipient' => "102425",
);
$res = $client->request('POST', $url, [
'multipart' => [
[
'name' => 'FileContents',
'contents' => file_get_contents($path . $name),
'filename' => $name
],
[
'name' => 'FileInfo',
'contents' => json_encode($fileinfo)
]
],
]);
API Controller
if ($request->filled('FileInfo')) {
return response()->json([
'status' => 'error',
'file_content' => $request->file('FileContents'),
'media'=>$request->hasFile('FileContents'),
]);
}
This is the response I get: "file":null,"file_content":{},"media":true
Why is the file content empty when media is true, meaning there is a file?
There can be multiple reasons:
Check your content type. Sometimes we forget to specify multipart/form-data and the request is sent with the generic application/x-www-form-urlencoded instead.
Secondly, if you are using jQuery, then set:
contentType: false,
processData: false,
cache: false
This keeps your form request from getting converted to a string payload and stops Ajax from setting the content type itself. Try using FormData.
Update: found a similar question; this post has a detailed answer for it.
I am now using Guzzle to send multiple files uploaded by users to another server.
From my understanding, we can send files using Guzzle by using this code:
'multipart' => [
[
'name' => 'FileContents',
'contents' => file_get_contents($path . $name),
'filename' => $name
],
[
'name' => 'FileInfo',
'contents' => json_encode($fileinfo)
]
],
However, the number of files uploaded may vary, and the code above can send only one file. How can I send an unknown number of files using Guzzle? Thank you very much!
It depends on the server you are sending the request to: can it handle an array inside FileContents and FileInfo, or does it expect something like FileContents_1, FileContents_2, ..., FileContents_N?
Anyway in Guzzle you can easily fill the array with a loop:
$multipart = [];
foreach ($files as $file) {
    // $path, $name and $fileinfo are assumed to be derived from the current $file
    $multipart[] = [
        'name' => 'FileContents[]',
        'contents' => fopen($path . $name, 'r'),
        'filename' => $name
    ];
    $multipart[] = [
        'name' => 'FileInfo[]',
        'contents' => json_encode($fileinfo)
    ];
}
P.S.
It's also better to use fopen() instead of file_get_contents(), because you don't need to read the files into memory in your script, only to pass them to Guzzle.
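Once the array is filled, you pass it as the multipart option, for example (assuming the $client and $url from the question):
$res = $client->request('POST', $url, [
    'multipart' => $multipart, // the array assembled in the loop above
]);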
I am really stuck on this and need expert help. Let me explain what I am trying to achieve and the setup. I have a script which posts to an HTTPS form using Zend_Http_Client. On the server I have Tor and Privoxy running. Everything worked just fine, but now I need to make the code more scalable by running multiple instances of Tor and Privoxy on the same server.
Hence I switched the Zend adapter from Zend_Http_Client_Adapter_Curl to Zend_Http_Client_Adapter_Proxy. But after changing the adapter I have run into a strange error that says 400 Invalid header received from client, and when I dump the Zend client response object, I see the following:
MyWebClientResponse::__set_state(array(
'json' => NULL,
'version' => '1.1',
'code' => 400,
'message' => 'Invalid header received from client',
'headers' =>
array (
'Proxy-agent' => 'Privoxy 3.0.19',
'Content-type' => 'text/plain',
'Connection' => 'close',
),
'body' => 'Invalid header received from client.
',
))
I do not understand what it is that I am doing wrong. The code is built on the Yii framework, so it is hard to share all the classes and models, but I am sharing the main parts of the code responsible for this:
$client = MyWebClient::factory();
$adapter = $client->getAdapter();
$adapter->setConfig(array('timeout' => 120,
'proxy_host' => 'localhost',
'proxy_port' => 8124
));
$client->setAdapter($adapter);
$client->setCookieJar(true);
$client->setParameterPost(array(
'name' => 'firstname',
'password' => 'password',
'login' => 'home'
));
$response = $client->setUri('https://www.domain.com/post.php')->requestApi('POST', false);
Here's the factory method of the class MyWebClient, just in case it is required; all other methods are standard.
static public function factory($new = false)
{
if (!isset(self::$client))
{
self::$client = new MyWebClient();
self::$client->setConfig(array(
'adapter' => 'Zend_Http_Client_Adapter_Proxy',
// 'proxy_host' => 'localhost',
// 'proxy_port' => 8118,
'persistent' => false,
'timeout' => 120
));
}
return self::$client;
}
The headers are being set in the requestApi method, and the snippet is:
$headers = array(
'X-PHX' => 'true',
'X-Requested-With' => 'XMLHttpRequest',
'Referer' => 'https://domain.com/index.html',
'User-Agent' => self::getRandomUserAgent()
);
$this->setHeaders($headers);
I would really appreciate help in this regard. I think it's Privoxy that is not letting the request out of the server.
Sachin
Looks like you may need to wrap the user-agent with quotes. Try this when setting the headers and see if that fixes the problem.
$headers = array(
'X-PHX' => 'true',
'X-Requested-With' => 'XMLHttpRequest',
'Referer' => 'https://domain.com/index.html',
'User-Agent' => "'".self::getRandomUserAgent()."'"
);
$this->setHeaders($headers);
Finally I decided to use the native cURL library and the cookie jar that comes with it. It worked as expected.
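For reference, a rough sketch of the kind of native cURL call I mean, with a cookie jar and one of the local proxy instances; the URL, form fields, user agent and cookie-jar path below are placeholders, not my real values:
$ch = curl_init('https://www.domain.com/post.php');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query([
        'name'     => 'firstname',
        'password' => 'password',
        'login'    => 'home',
    ]),
    CURLOPT_PROXY          => 'localhost:8124',   // one of the Privoxy instances
    CURLOPT_HTTPHEADER     => [
        'X-PHX: true',
        'X-Requested-With: XMLHttpRequest',
        'Referer: https://domain.com/index.html',
    ],
    CURLOPT_USERAGENT      => 'Mozilla/5.0',      // stand-in for getRandomUserAgent()
    CURLOPT_COOKIEJAR      => '/tmp/cookies.txt', // cURL writes cookies here after the request
    CURLOPT_COOKIEFILE     => '/tmp/cookies.txt', // and sends them back on subsequent requests
    CURLOPT_TIMEOUT        => 120,
]);
$response = curl_exec($ch);
curl_close($ch);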
Cheers,
Sachin