I'm using the Symfony2 KnpSnappyBundle to generate PDFs. I want to generate a PDF from an HTML page with CSS. I found a solution, but it has a problem:
$pageUrl = $this->generateUrl('accounts_management_generate_pdf_markup',
    array('invoice' => $invoiceData), true); // use an absolute URL!

return new \Symfony\Component\HttpFoundation\Response(
    $this->get('knp_snappy.pdf')->getOutput($pageUrl), 200, array(
        'Content-Type' => 'application/pdf',
        'Content-Disposition' => 'attachment; filename="file.pdf"'
    )
);
The problem is that the page behind accounts_management_generate_pdf_markup is in a secured area and cannot be accessed without authentication. The generated file is just the login page, to which accounts_management_generate_pdf_markup redirects when not logged in.
My questions are:
Is there any way to pass authentication credentials to Snappy?
Is there another way, using the Snappy bundle, to generate the PDF with styles (CSS)?
You can add the session cookie as an argument to the getOutput function. Save and close the session first: PHP locks the session file for the duration of a request, and wkhtmltopdf's request (which carries the same session cookie) would otherwise block waiting for that lock:
$pageUrl = $this->generateUrl('route', array('id' => $id), true);

$session = $this->get('session');
$session->save();
session_write_close();

return new Response(
    $this->get('knp_snappy.pdf')->getOutput(
        $pageUrl,
        array('cookie' => array($session->getName() => $session->getId()))
    ),
    200,
    array(
        'Content-Type' => 'application/pdf',
        'Content-Disposition' => 'attachment; filename="file.pdf"'
    )
);
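This also answers the second question: you can skip the HTTP round-trip (and the firewall) entirely by rendering the Twig template yourself and handing the HTML to Snappy's getOutputFromHtml. A minimal sketch, assuming a hypothetical template name, and assuming your stylesheets are inlined or referenced by absolute URLs (wkhtmltopdf cannot resolve relative paths from an HTML string):

// Render the secured page's template directly; no HTTP request is made,
// so authentication never comes into play.
$html = $this->renderView('AppBundle:Invoice:pdf.html.twig', array(
    'invoice' => $invoiceData,
));

return new \Symfony\Component\HttpFoundation\Response(
    $this->get('knp_snappy.pdf')->getOutputFromHtml($html),
    200,
    array(
        'Content-Type' => 'application/pdf',
        'Content-Disposition' => 'attachment; filename="file.pdf"'
    )
);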
I have a PHP file that uploads images to Google Cloud Storage, but every time a user changes their profile image, the old one keeps being served for a long time, so users think the image was not updated at all. I would like to know what cache and metadata configuration is required to avoid this issue.
I would prefer not to use "custom metadata", because of this note in the Cloud Storage documentation (https://cloud.google.com/storage/docs/metadata): "Note that using custom metadata incurs storage and network costs."
This is my code:
require 'vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

$storage = new StorageClient([
    'keyFilePath' => $keypath
]);
$storage->registerStreamWrapper();
$bucket = $storage->bucket($url);

$filename = $_FILES['profile_picture']['name'];
$newFileName = $_POST['user_id'].'_profile.jpg';

$bucket->upload(
    fopen($_FILES["profile_picture"]["tmp_name"], 'r'),
    [
        'name' => $dir.$newFileName,
        'metadata' => [
            'cacheControl' => 'Cache-Control: no-cache, max-age=0',
        ]
    ]
);
I don't know if cacheControl is enough (and if the syntax is correct), or if I need to include Custom-Time as well, and what the syntax for that would be.
My try:
$bucket->upload(
    fopen($_FILES["profile_picture"]["tmp_name"], 'r'),
    [
        'name' => $dir.$newFileName,
        'metadata' => [
            'cacheControl' => 'Cache-Control: no-cache, max-age=0',
            'Custom-Time' => date('Y-m-d\TH:i:s.00\Z')
        ]
    ]
);
The documentation doesn't say anything about the PHP syntax: https://cloud.google.com/storage/docs/metadata
Please help.
I believe you should use:
$bucket->upload(
    fopen($_FILES["profile_picture"]["tmp_name"], 'r'),
    [
        'name' => $dir.$newFileName,
        'metadata' => [
            'cacheControl' => 'no-cache, max-age=0',
        ]
    ]
);
I have noticed that you are repeating the header name inside the cacheControl value in your code; the value should contain only the directives. Also, remember that:
If you allow caching, downloads may continue to receive older versions of an object, even after uploading a newer version. This is because the older version remains "fresh" in the cache for a period of time determined by max-age. Additionally, because objects can be cached at various places on the Internet, there is no way to force a cached object to expire globally. If you want to prevent serving cached versions of publicly readable objects, set Cache-Control: no-cache, max-age=0 on the object.
For further reading, refer to this documentation.
My solution with the correct metadata syntax:
$bucket->upload(
    fopen($_FILES["profile_picture"]["tmp_name"], 'r'),
    [
        'name' => $_SESSION['idcompany'].'/uploads/'.$newFileName,
        'metadata' => [
            'cacheControl' => 'no-cache, max-age=0',
            'customTime' => gmdate('Y-m-d\TH:i:s.00\Z')
        ]
    ]
);
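To confirm the metadata actually landed on the object, it can be read back with the same client. A quick sketch (reusing the bucket and object name from above):

// Fetch the object's metadata and print the two fields that were set.
$object = $bucket->object($_SESSION['idcompany'].'/uploads/'.$newFileName);
$info = $object->info();

echo $info['cacheControl']; // "no-cache, max-age=0"
echo $info['customTime'];   // e.g. "2021-03-01T12:00:00.00Z"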
What I have so far
Right now I have a working OAuth2 authentication between a Laravel user and the Dropbox API. Every user can upload files to their personal folder.
The Problem
After uploading a file with Laravel via the Dropbox API v2, I can see that an empty (0 bytes) file was uploaded.
Used to accomplish this task:
Laravel
Guzzle
Dropbox API Library
What am I missing?
The Code
My function for processing a form looks like this:
$formFile = $request->file('fileToUpload');
$path = $formFile->getClientOriginalName();
$file = $formFile->getPathName();
$result = Dropbox::files()->upload($path, $file);
return redirect('dropboxfiles');
And my files->upload function in my Dropbox Library looks like this:
$client = new Client;

$response = $client->post("https://content.dropboxapi.com/2/files/upload", [
    'headers' => [
        'Authorization' => 'Bearer '.$this->getAccessToken(),
        'Content-Type' => 'application/octet-stream',
        'Dropbox-API-Arg' => json_encode([
            'path' => $path,
            'mode' => 'add',
            'autorename' => true,
            'mute' => true,
            'strict_conflict' => false
        ])
    ],
    'data-binary' => '#'.$file
]);
The file, as I said, gets uploaded successfully, with the correct name, but it is 0 bytes: an empty file.
Thank you so much in advance for your help!
Update
With the following code I made it work. My question, though, is whether there is a better, more "Laravel-like" solution than using fopen:
$response = $client->post("https://content.dropboxapi.com/2/files/upload", [
    'headers' => [
        'Authorization' => 'Bearer '.$this->getAccessToken(),
        'Dropbox-API-Arg' => json_encode([
            'path' => $path,
            'mode' => 'add',
            'autorename' => true,
            'mute' => true,
            'strict_conflict' => false
        ]),
        'Content-Type' => 'application/octet-stream',
    ],
    'body' => fopen($file, "r"),
]);
As @Greg mentioned (see the cross-linked reference), I was able to solve this issue by using
'body' => fopen($file, "r"),
instead of
'data-binary' => '#'.$file
This is, as Greg mentioned, because data-binary is a cURL option; other HTTP clients, like Guzzle in my case, use different option names.
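On the "Laravel-like" follow-up: Guzzle's body option accepts a plain string as well as a stream, so one option is to let Laravel's UploadedFile read the upload for you and keep fopen out of the picture. A sketch under that assumption:

// Controller: read the upload into memory via the UploadedFile helper.
$contents = $request->file('fileToUpload')->get();

// Library: a string body works just as well as a stream.
$response = $client->post("https://content.dropboxapi.com/2/files/upload", [
    'headers' => [
        'Authorization' => 'Bearer '.$this->getAccessToken(),
        'Dropbox-API-Arg' => json_encode(['path' => $path, 'mode' => 'add']),
        'Content-Type' => 'application/octet-stream',
    ],
    'body' => $contents,
]);

Note that this buffers the whole file in memory, so for large uploads the fopen stream remains the better choice.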
The Google Drive API seems confusing when it comes to downloading files. I've been able to list all of the files and even downloaded a few, but I got stuck while downloading Google Docs.
Why is there no common method for downloading a file, rather than the two separate methods Get and Export? When I don't need any conversion for the file, why do I need to specify the MIME type for Export? Also, while listing files, the MIME type for all the files is always null, e.g.:
@foreach ($files->getFiles() as $file)
    {{ $file->getMimeType() }} // empty string / null
@endforeach
Reference to getMimeType
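For what it's worth, the split is: files.get with alt=media downloads a binary file's bytes as-is, while files.export is only for Google-native files (Docs, Sheets, Slides) and needs a concrete target format, since those files have no single binary representation. A minimal sketch of that branching, assuming the same google/apiclient Google_Service_Drive setup used below:

// First fetch only the metadata to see what kind of file this is.
$meta = $service->files->get($file_id, array('fields' => 'mimeType'));

if (strpos($meta->getMimeType(), 'application/vnd.google-apps') === 0) {
    // Google-native file: must be exported to a concrete format, e.g. PDF.
    $content = $service->files->export($file_id, 'application/pdf', array('alt' => 'media'));
} else {
    // Regular binary file: download the exact bytes, no conversion involved.
    $content = $service->files->get($file_id, array('alt' => 'media'));
}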
Here's what I've done so far
public function listFiles(Request $request)
{
    $client = $this->getClient();
    $service = new Google_Service_Drive($client);
    $optParams = array(
        'q' => 'fullText contains \'"abc"\'',
        'pageSize' => 10,
        'fields' => 'nextPageToken, files(id, name)'
    );
    $files = $service->files->listFiles($optParams);
}
public function downloadFile($file_id)
{
    $client = $this->getClient();
    $service = new Google_Service_Drive($client);
    $file = $service->files->export($file_id, 'application/pdf', array(
        'alt' => 'media'
    )); // I really need to get rid of the MIME type (application/pdf); it should download the exact same file
}
Update
I have managed to solve the MIME type problem by providing the field in the list:
$optParams = array(
    'q' => 'fullText contains \'"basheer"\'',
    'pageSize' => 10,
    'fields' => 'nextPageToken, files(id, name, mimeType)' // <--
);
But now when I try to download the file, it throws the following error:
The requested conversion is not supported. "domain": "global", "reason": "badRequest",
// request('mimeType') = application/vnd.google-apps.spreadsheet
$file = $service->files->export($file_id, request('mimeType'), array(
    'alt' => 'media'
));
Your answers and comments will be appreciated.
Thanks
I'm using the following code:
$file = Storage::disk('s3')->getDriver()->readStream(attachmentPath().$attachment->filename);

return \Response::stream(function () use ($file) {
    fpassthru($file);
}, 200, [
    'Content-Type' => $attachment->mimetype,
    'Content-Description' => 'File Transfer',
    'Content-Disposition' => 'attachment; filename=' . $attachment->filename,
    'Content-Length' => $attachment->size
]);
The variables correspond to the following (with PDF or JPG as examples):
"mimetype" (application/pdf or image/jpeg)
"filename" (ex: 240-ivlvei-pdf.pdf or 240-zi1gdv-ddvj63hxsaqn0az.jpg)
"size" in bytes
With PDFs, it works perfectly.
With images/GIFs, the file downloads damaged.
I can't figure out why. I assume it is something to do with the headers.
Any ideas?
Try using the getObject method and see if it works (http://docs.aws.amazon.com/AmazonS3/latest/dev/RetrieveObjSingleOpPHP.html):
$downloader = $s3->getObject(array(
    'Bucket' => $bucket,
    'Key' => $object['Key'],
    'SaveAs' => dirname(__FILE__)."/name_of_file.jpg"
));
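If you would rather keep the streaming approach, one thing worth ruling out (an assumption, not something confirmed here) is stray output-buffer content corrupting the binary body: PDF readers tolerate leading junk before the %PDF marker, while image decoders require the magic bytes at offset 0, which would explain PDFs surviving while images break. A sketch that clears any open buffers before streaming:

return \Response::stream(function () use ($file) {
    // Discard anything already buffered (stray whitespace, BOMs, debug
    // output) so the image bytes are the only thing sent.
    while (ob_get_level() > 0) {
        ob_end_clean();
    }
    fpassthru($file);
}, 200, [
    'Content-Type' => $attachment->mimetype,
    'Content-Disposition' => 'attachment; filename=' . $attachment->filename,
]);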
I followed a guide on how to host your (premium) plugin on a server for auto-updates. Everything is working, but it's not secure at all: the link to the plugin's ZIP is public and anyone can download it.
Here is what the update.php file (on my server) looks like:
if (isset($_POST['action'])) {
    switch ($_POST['action']) {
        case 'version':
            echo "3.1.1";
            break;
        case 'info':
            $obj = new stdClass();
            $obj->slug = '...';
            $obj->plugin_name = '...';
            $obj->new_version = "3.1.0";
            $obj->requires = '4.7';
            $obj->tested = '4.7.3';
            $obj->downloaded = 12540;
            $obj->last_updated = '2017-02-12';
            $obj->homepage = '...';
            $obj->sections = array(
                'description' => '...'
            );
            $obj->download_link = 'https://.../latest.zip';
            echo serialize($obj);
            break;
        case 'license':
            echo 'false';
            break;
    }
} else {
    header('Cache-Control: public');
    header('Content-Description: File Transfer');
    header('Content-Type: application/zip');
    readfile('latest.zip');
}
The script will always return the .zip file if no POST parameter (version, info, or license) is provided.
All I want is a parameter that is sent to update.php when WordPress requests the new .zip, just so I can authorize the download.
Even if anyone just knows where this process is documented, that would help a lot as well.
This is how WordPress makes the POST request to its server when installing a new plugin:
array (
    'method' => 'POST',
    'timeout' => 15,
    'redirection' => 5,
    'httpversion' => '1.0',
    'user-agent' => 'WordPress/4.7.3; http://example.com/',
    'reject_unsafe_urls' => false,
    'blocking' => true,
    'headers' => array (),
    'cookies' => array (),
    'body' => array (
        'action' => 'plugin_information',
        'request' => 'O:8:"stdClass":4:{s:4:"slug";s:7:"jetpack";s:6:"fields";a:1:{s:8:"sections";b:0;}s:8:"per_page";i:24;s:6:"locale";s:5:"en_US";}',
    ),
    'compress' => false,
    'decompress' => true,
    'sslverify' => true,
    'sslcertificates' => '/var/www/html/wp2/wp-includes/certificates/ca-bundle.crt',
    'stream' => false,
    'filename' => NULL,
    'limit_response_size' => NULL,
)
According to your question, you can verify users by a secure token (preferable) or by email address.
So whenever a user installs the plugin, create a secure token on your server, send it to the plugin, and have the plugin store it in the WordPress database (wp_options, preferably). On every subsequent call made to the server, verify that the token is valid and otherwise reject the request.
You need to change your code to verify the data coming in via POST, rather than blindly checking that POST data is present.
You can send your token in a header, for example:
Authorization: Basic dm9yZGVsOnZvcmRlbA==
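As a rough sketch of that check on the server side (the header name, customer_id field, and load_token_for_customer() lookup are assumptions for illustration, not part of the original guide), update.php could require a valid token before serving the ZIP:

// Extract the token the client sent in the Authorization header.
$sent = isset($_SERVER['HTTP_AUTHORIZATION'])
    ? trim(str_ireplace('Bearer', '', $_SERVER['HTTP_AUTHORIZATION']))
    : '';

// Look up the token issued to this customer at install time.
// load_token_for_customer() is a hypothetical lookup against your own DB.
$expected = load_token_for_customer(isset($_POST['customer_id']) ? $_POST['customer_id'] : '');

// hash_equals() gives a timing-safe string comparison.
if ($expected === null || $sent === '' || !hash_equals($expected, $sent)) {
    http_response_code(403);
    exit('Invalid or missing token.');
}

// Token is valid: fall through to the existing version/info/ZIP logic.
readfile('latest.zip');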