openssl_pkcs7_sign(): error opening file - PHP

It's my first time signing a profile with a certificate using OpenSSL. I keep hitting the above error, and I've tried realpath() and prepending file://, but I still can't get openssl_pkcs7_sign() to sign the profile. I don't understand how this works. Any insights would be appreciated.
Edit 1: I'm not sure which file is the problematic one; the error message isn't specific enough. Is there a way to tell?
Code below:
function signProfile()
{
    $filename = "./template.mobileconfig";
    $filename = realpath($filename);
    $outFilename = $filename . ".tmp";

    $pkey = dirname(__FILE__) . "/PteKey.key";
    $pkey = realpath($pkey);

    $certFile = dirname(__FILE__) . "/CertToSign.crt";
    $certFile = realpath($certFile);

    // try signing the plain XML profile
    if (openssl_pkcs7_sign($filename, $outFilename, 'file://' . $certFile, array('file://' . $pkey, ""), array(), 0, ""))
    {
        // get the data back from the filesystem
        $signedString = file_get_contents($outFilename);

        // trim the fat
        $trimmedString = preg_replace('/(.+\n)+\n/', '', $signedString, 1);

        // convert to binary (DER)
        $decodedString = base64_decode($trimmedString);

        // write the file back to the filesystem (using the filename originally given)
        $fh = fopen($filename, 'w');
        fwrite($fh, $decodedString);
        fclose($fh);

        // delete the temporary file
        unlink($outFilename);

        return TRUE;
    }
    else
    {
        return FALSE;
    }
}
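For the Edit 1 question: one way to tell which path openssl is failing to open is to check each file yourself before the call and, if signing still fails, drain OpenSSL's own error queue. A minimal sketch using the same variables as above (is_readable() and openssl_error_string() are standard PHP functions):

// Check each input path before signing; realpath() returns false for missing files.
foreach (array('profile' => $filename, 'key' => $pkey, 'cert' => $certFile) as $label => $path) {
    if ($path === false || !is_readable($path)) {
        echo "Cannot read {$label} file: " . var_export($path, true) . "\n";
    }
}

// If signing still fails, openssl_error_string() pops one queued message per call.
if (!openssl_pkcs7_sign($filename, $outFilename, 'file://' . $certFile, array('file://' . $pkey, ""), array(), 0)) {
    while ($msg = openssl_error_string()) {
        echo $msg . "\n";
    }
}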

Remove the unwanted trailing arguments if they are not used, i.e. call
openssl_pkcs7_sign($mobileConfig, $tmpMobileConfig, $certFile, array($pkey, ""), array());
and make sure the file paths are supplied correctly:
require_once('variables.php'); // stores absolute paths in $tmpMobileConfig / $pteKeyPath / $CertToSignPath

$prepend = "file://";

$mobileConfig = realpath("./template.mobileconfig");

$pkey = $prepend . $pteKeyPath;
$pkey = str_replace('\\', '/', $pkey);

$certFile = $prepend . $CertToSignPath;
$certFile = str_replace('\\', '/', $certFile);

$isSignedCert = openssl_pkcs7_sign($mobileConfig, $tmpMobileConfig, $certFile, array($pkey, ""), array());
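If the paths are right but signing still fails, a quick pre-flight check that the certificate and key parse and actually belong together can also help. A small sketch using standard PHP OpenSSL functions, reusing $CertToSignPath and $pteKeyPath from variables.php:

// Pre-flight check: make sure the cert and key load, and that the key matches the cert.
$cert = openssl_x509_read(file_get_contents($CertToSignPath));
$key  = openssl_pkey_get_private(file_get_contents($pteKeyPath)); // pass a passphrase as 2nd arg if the key is encrypted

if ($cert === false || $key === false) {
    echo "Could not parse the cert or key: " . openssl_error_string() . "\n";
} elseif (!openssl_x509_check_private_key($cert, $key)) {
    echo "The private key does not match the certificate.\n";
}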


Check for Valid/Active URLs before download - Laravel 7

I'm trying to download images and store them locally in my assets folder.
Before starting the download, I want to check that the link is still live.
I only want to start the download if the link responds with 200 OK.
Try #1
public function handle()
{
    $skills = Skill::all();

    if ($skills != null) {
        foreach ($skills as $i => $skill) {
            if (strpos($skill->img_path, 'http') !== false) {
                if (!isset($exception)) {
                    // update the path in DB
                    $image_path = '/assets/fe/img/skill/';
                    $img_name = $skill->name . '.png';
                    $path = public_path() . $image_path . $img_name;
                    $uploadSuccess = file_put_contents($path, file_get_contents($skill->img_path));
                    // dd($uploadSuccess);
                    if ($uploadSuccess) {
                        $skill->img_path = $image_path . $img_name;
                    }
                }
            }
            $skill->save();
        }
    }
}
I seem to run into many issues. One of them is
curl: (6) Could not resolve host: thumbsplus.tutsplus.com
Another one is
file_get_contents(https://assets-cdn.github.com/images/modules/logos_page/Octocat.png): failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found
What is the cleaner way? Please suggest, and I will try it.
Try #2
public function handle()
{
    $skills = Skill::all();

    if ($skills != null) {
        foreach ($skills as $i => $skill) {
            if (strpos($skill->img_path, 'http') !== false) {
                $file_headers = @get_headers($skill->img_path);

                if (!$file_headers || $file_headers[0] == 'HTTP/1.1 404 Not Found') {
                    $exists = false;
                } else {
                    $exists = true;

                    if (!isset($exception)) {
                        // update the path in DB
                        $image_path = '/assets/fe/img/skill/';
                        $img_name = $skill->name . '.png';
                        $path = public_path() . $image_path . $img_name;
                        $uploadSuccess = file_put_contents($path, file_get_contents($skill->img_path));
                        // dd($uploadSuccess);
                        if ($uploadSuccess) {
                            $skill->img_path = $image_path . $img_name;
                        }
                    }
                }
            }
            $skill->save();
        }
    }
}
Try #3
public function handle()
{
    $skills = Skill::all();
    $failCount = 0;
    $successCount = 0;
    $failList = [];

    if ($skills != null) {
        foreach ($skills as $i => $skill) {
            if (strpos($skill->img_path, 'http') !== false) {
                $file_headers = @get_headers($skill->img_path);

                if (!$file_headers || strpos($file_headers[0], '404') !== false) {
                    $exists = false;
                    $failCount++;
                    array_push($failList, $skill->name);
                    // break;
                } else {
                    $exists = true;
                    $successCount++;
                    // DEBUG
                    // dd($file_headers[0]);
                    if (strpos($file_headers[0], '200')) {
                        // update the path in DB
                        $image_path = '/assets/fe/img/skill/';
                        $img_name = $skill->name . '.png';
                        $path = public_path() . $image_path . $img_name;
                        $uploadSuccess = file_put_contents($path, file_get_contents($skill->img_path));
                        // dd($uploadSuccess);
                        if ($uploadSuccess) {
                            $skill->img_path = $image_path . $img_name;
                        }
                    }
                }
            }
            $skill->save();
            echo ".";
        }
    }

    echo "\r\n";
    $this->info('=========================');
    $this->info('Success :' . $successCount);
    $this->info('=========================');
    $this->info('Fail :' . $failCount);
    $this->info('List :' . print_r($failList));
    $this->info('=========================');
}
This seems to work, but it hangs at certain dots, sometimes for more than a minute.
⚡️ php artisan skillIcons:download
..............................................................................................................................
=========================
Success :12
=========================
Fail :7
Array
(
[0] => GitHub
[1] => Geolocation API
[2] => Xcode
[3] => Protractor
[4] => Sketch
[5] => Amazon ECR
[6] => WinSCP
)
List :1
=========================
All the images seem to have been downloaded successfully.
⚡️ ls public/assets/fe/img/skill/
AWS Console.png Digital Ocean.png Javascript.png PayPal.png Terminal.png
AWS.png Disqus.png Jest.png Photoshop.png TextMate.png
Alimofire.png Divvy.png Jira.png Pod.png TextWrangler.png
Amazon ECS.png Docker.png Kamar.png PostgreSQL.png Transmit.png
Amazon RDS.png Duet.png LESS.png PyCharm.png Twitter.png
Angular.png EC2.png Laravel Elixir.png Python.png Ubuntu.png
AngularJS.png Evernote .png Laravel.png QuickBooks.png VMWare Fusion .png
Apache.png Express.png Linode.png React Native.png VS Code.png
Atom.png Facebook.png Mac OS X.png Realm.png Vagrant.png
Bash.png Final Cut.png Markdown.png Redis.png Virtual Machine.png
BitBucket.png FusionCharts.png MobaXTerm.png RequireJS.png Virtualbox.png
Bower.png GitLab.png Mocha.png S3.png Webpack.png
CKEditor.png Go Daddy.png MySQL.png SAML 2.0.png Windows.png
CSS.png Google Chart.png NPM.png Salesforce.png Wireshark.png
Camtasia.png Google Map.png Navicat Premium.png Sass.png Word.png
Cent OS.png Google Translation.png Nginx.png Secure Shell.png Yarn.png
Chai.png Gulp.png Node.png Selenium.png iMovie.png
Chat.io.png HTML.png Noteability.png Shopify.png iOS.png
Coda.png Heroku.png OAuth 2.0.png SinonJS.png jQuery.png
CodeBox.png Illustrator.png Open Stack.png Siteground.png
Composer .png Instagram.png OpenID Connect.png Sublime Text.png
Confluence .png J Player.png PHP.png Swagger.png
How do I decrease the wait time to only 3 seconds?
You have to handle the errors in some way. You can wrap the download in a try/catch:
try {
    // ...
} catch (\Exception $e) {
    // ...
}
But I prefer doing things this way:
public function handle()
{
    Skill::get()->map(function ($skill) {
        // skip rows that do not point to a remote image
        if (strpos($skill->img_path, 'http') === false) return;

        $img = $this->getImageFromUrl($skill->img_path);
        if ($img === null) return;

        $image_path = '/assets/fe/img/skill/';
        $img_name = $skill->name . '.png';
        $path = public_path() . $image_path . $img_name;

        $fp = fopen($path, 'x'); // 'x' fails if the file already exists; use 'w' to overwrite
        $uploadSuccess = fwrite($fp, $img);
        fclose($fp);

        if ($uploadSuccess) {
            $skill->img_path = $image_path . $img_name;
            $skill->save();
        }
    });
}

public function getImageFromUrl($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0); // seconds to wait while connecting; 0 waits indefinitely
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);        // maximum number of seconds the whole transfer may take

    $img = curl_exec($ch);
    $err = curl_error($ch);
    curl_close($ch);

    if ($err) {
        echo $err;
        return null;
    }

    return $img;
}
I refactored the code a bit; if you like it as is, keep it, otherwise take just the logic and adapt it however you prefer. Hope this helps.
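For the specific goal of capping the liveness check at about 3 seconds, one option is a HEAD request through cURL with explicit connect and total timeouts, downloading only when it returns 200. A minimal sketch (the isUrlLive() helper name is mine, not from the code above):

// Hypothetical helper: true only if the URL answers 200 within roughly 3 seconds.
public function isUrlLive($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request, skip the body
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects to the final resource
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 3);    // give up connecting after 3 seconds
    curl_setopt($ch, CURLOPT_TIMEOUT, 3);           // cap the whole request at 3 seconds
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    return $status === 200;
}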
You may use the active_url validation rule to check whether the given URL is alive. According to the docs:
The field under validation must have a valid A or AAAA record according to the dns_get_record PHP function. The hostname of the provided URL is extracted using the parse_url PHP function before being passed to dns_get_record.
if (validator([$skill->img_path], ['active_url'])->fails()) {
    // URL is not valid/active
} else {
    // URL is valid/active
}
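Note that active_url only checks DNS (an A or AAAA record), not whether the request actually returns 200. On Laravel 7 you could pair it with the HTTP client and a short timeout; this is a sketch under the assumption that the Http facade and its timeout() method are available in your setup:

use Illuminate\Support\Facades\Http;

// Hypothetical combination: DNS check first, then an HTTP check capped at 3 seconds.
$dnsOk = !validator(['url' => $skill->img_path], ['url' => 'active_url'])->fails();

if ($dnsOk && Http::timeout(3)->get($skill->img_path)->ok()) {
    // URL resolves and answered 200 within 3 seconds; safe to download
}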

Force download on GCS via App Engine using Signed URL

I get my file via:
require_once 'google/appengine/api/cloud_storage/CloudStorageTools.php';
use google\appengine\api\cloud_storage\CloudStorageTools;
$public_link = CloudStorageTools::getPublicUrl("gs://bucket/file.pdf", false);
If I go to $public_link in the browser, it shows the PDF inside the browser. I am trying to figure out how I can force the download of this file.
Google App Engine only has a 60-second timeout, so I'm afraid the serve function won't work via GAE. Does anyone have any suggestions?
--
EDIT
Andrei Volga's previous answer in this post suggests I use a Signed URL with a response-content-disposition header.
So far, I am able to create a signed URL that successfully shows the file, but I am not able to generate a signed URL with any sort of header at all, i.e. one that forces the download instead of just showing the file.
This is what I have so far, most of which is courtesy of mloureiro.
function googleBuildConfigurationString($method, $expiration, $file, array $options = [])
{
    $allowedMethods = ['GET', 'HEAD', 'PUT', 'DELETE'];

    // initialize
    $method = strtoupper($method);
    $contentType = isset($options['Content_Type']) ? $options['Content_Type'] : '';
    $contentMd5 = isset($options['Content_MD5']) && $options['Content_MD5'] ? base64_encode($options['Content_MD5']) : '';
    $headers = !empty($options['Canonicalized_Extension_Headers']) ? $options['Canonicalized_Extension_Headers'] . PHP_EOL : '';
    $file = $file ? $file : (isset($options['Canonicalized_Resource']) ? $options['Canonicalized_Resource'] : '');

    // validate
    if (array_search($method, $allowedMethods) === false) {
        throw new RuntimeException("Method '{$method}' is not allowed");
    }

    if (!$expiration) {
        throw new RuntimeException("An expiration date should be provided.");
    }

    return <<<TXT
{$method}
{$contentMd5}
{$contentType}
{$expiration}
{$headers}{$file}
TXT;
}

function googleSignString($p12FilePath, $string)
{
    $certs = [];

    if (!openssl_pkcs12_read(file_get_contents($p12FilePath), $certs, 'notasecret')) {
        echo "Unable to parse the p12 file. OpenSSL error: " . openssl_error_string();
        exit();
    }

    $RSAPrivateKey = openssl_pkey_get_private($certs["pkey"]);
    $signed = '';

    if (!openssl_sign($string, $signed, $RSAPrivateKey, 'sha256')) {
        error_log('openssl_sign failed!');
        $signed = 'failed';
    } else {
        $signed = base64_encode($signed);
    }

    return $signed;
}

function googleBuildSignedUrl($serviceEmail, $file, $expiration, $signature)
{
    return "http://storage.googleapis.com{$file}"
        . "?GoogleAccessId={$serviceEmail}"
        . "&Expires={$expiration}"
        . "&Signature=" . urlencode($signature);
}

$serviceEmail = '<EMAIL>';
$p12FilePath = '../../path/to/cert.p12';
$expiration = (new DateTime())->modify('+3hours')->getTimestamp();

$bucket = 'bucket';
$fileToGet = 'picture.jpg';
$file = "/{$bucket}/{$fileToGet}";

$string = googleBuildConfigurationString('GET', $expiration, $file, array("Canonicalized_Extension_Headers" => ''));
$signedString = googleSignString($p12FilePath, $string);
$signedUrl = googleBuildSignedUrl($serviceEmail, $file, $expiration, $signedString);

echo $signedUrl;
For small files you can use the serve option instead of a public URL, with the save-as option set to true. See the documentation.
For large files you can use a Signed URL with the response-content-disposition parameter.
You can only add an additional query string parameter:
https://cloud.google.com/storage/docs/xml-api/reference-headers#responsecontentdisposition
response-content-disposition
A query string parameter that allows content-disposition to be overridden for authenticated GET requests.
Valid values: a URL-encoded header to return instead of the content-disposition of the underlying object.
Example
?response-content-disposition=attachment%3B%20filename%3D%22foo%22
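Putting that together with the signed-URL code above: as far as I know, for V2-style signed URLs the response-content-* parameters are not part of the string that gets signed, so they can simply be appended to the generated URL. Treat this as a sketch rather than a confirmed recipe:

// Append a download disposition to the signed URL built by googleBuildSignedUrl().
$downloadName = 'file.pdf'; // whatever name the browser should save the file under
$disposition = rawurlencode('attachment; filename="' . $downloadName . '"');
$downloadUrl = $signedUrl . '&response-content-disposition=' . $disposition;

echo $downloadUrl;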

Looping through CSV and parsing a JSON query using each result

Despite hours of fiddling, I cannot understand why my JSON query only returns a result for the last line of the CSV/TXT file I am trying to parse.
Here is the code:
// Enter API key here
$api_key = 'AIzaSyB9Dq3w1HCxkS5qyELI_pZuTmdK8itOBHo';
$origin = 'RG12 1AA';
$output_type = 'json'; // xml or json
$csv_location = 'http://www.naturedock.co.uk/postcodes.csv';

// Do not edit
$base_url = 'https://maps.googleapis.com/maps/api/directions/';
$origin_url = '?origin=';
$destination_url = '&destination=';
$end_url = '&sensor=false&key=';

$page = join("", file("$csv_location"));
$kw = explode("\n", $page);

for ($i = 0; $i < count($kw); $i++) {
    $destination = $kw[$i];
    echo $destination;

    $raw_url = $base_url . $output_type . $origin_url . $origin . $destination_url . $destination . $end_url . $api_key;
    $request_url = str_replace(' ', '', $raw_url);

    $getJson = file_get_contents($request_url);
    $routes = json_decode($getJson);
    $result = $routes->routes[0]->legs[0]->distance->value;
    echo $result . '<br>';
}
The result I get looks like this:
Distance by Post Code Generator v0.1 by Phil Hughes
RG12 0GA
RG12 0GB
RG12 0GC
RG12 0GD4066
The '4066' is the correct value for the RG12 0GD postcode, but as you can see none of the others return results.
Please help.
Your
join("", file("$csv_location"));
concatenates all the lines from the file into a single string without a separator. The following explode() then sees no newlines any more, so you are working on one line only: count($kw) always evaluates to 1 and your loop runs only once.
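A more robust version of the loop, which reads the remote list line by line, trims stray \r characters from Windows line endings, and URL-encodes each postcode instead of stripping spaces, might look like this (a sketch, not the original answerer's code; it reuses the variables defined in the question):

// Read the remote CSV directly into an array, dropping newlines and empty lines.
$kw = file($csv_location, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

foreach ($kw as $destination) {
    $destination = trim($destination); // remove stray \r or surrounding spaces
    if ($destination === '') {
        continue;
    }

    $request_url = $base_url . $output_type . $origin_url . urlencode($origin)
        . $destination_url . urlencode($destination) . $end_url . $api_key;

    $routes = json_decode(file_get_contents($request_url));

    if (isset($routes->routes[0]->legs[0]->distance->value)) {
        echo $destination . ': ' . $routes->routes[0]->legs[0]->distance->value . '<br>';
    } else {
        echo $destination . ': no route found<br>';
    }
}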

Imagick - Can't read image files from URL.

I'm using this snippet for reading images on different websites:
$image = new Imagick('http://lp.hm.com/hmprod?set=key[source],value[/model/2012/P01 05156 06204 80 1175 4.jpg]&set=key[rotate],value[]&set=key[width],value[]&set=key[height],value[]&set=key[x],value[]&set=key[y],value[]&set=key[type],value[STILL_LIFE_FRONT]&call=url[file:/product/large]');
But sometimes, I get an error like this (about 20% of the time):
ImagickException
Unable to read the file: http://lp.hm.com/hmprod?set=key[source],value[/model/2012/P01 05156 06204 80 1175 4.jpg]&set=key[rotate],value[]&set=key[width],value[]&set=key[height],value[]&set=key[x],value[]&set=key[y],value[]&set=key[type],value[STILL_LIFE_FRONT]&call=url[file:/product/large]
Imagick->__construct()
The error seems to be consistent across this whole domain, but sometimes it varies from image to image on the same domain.
Questions
Why is this a problem?
How can we fix it?
Is there an alternative solution?
STEP 1 : Get the URL
$url = 'http://blablabla.com/blabla.jpeg';
STEP 2 : Save the blob content to a variable
$image = file_get_contents($url);
STEP 3 : Create an Imagick object
$img = new Imagick();
STEP 4 : Instruct the Imagick object to read the blob content of the image
$img->readImageBlob($image);
STEP 5 : Save (or do operations on) the image
$img->writeImage('/var/www/html/uploads/img.jpg');
STEP 6 : Destroy the Imagick instance
$img->destroy();
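Put together, those steps might look like this; the failure check and the download timeout via a stream context are my additions, not part of the original answer:

// Download the image ourselves (with a timeout) and hand the raw bytes to Imagick.
$context = stream_context_create(['http' => ['timeout' => 10]]);
$blob = file_get_contents($url, false, $context);

if ($blob === false) {
    die('Could not download the image.');
}

$img = new Imagick();
$img->readImageBlob($blob);
$img->writeImage('/var/www/html/uploads/img.jpg');
$img->destroy();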
So I figured out that I needed to encode the URL properly. I'm not sure this code is optimal, but it works, and it could hopefully help someone else.
$parsedUrl = parse_url('http://lp.hm.com/hmprod?set=key[source],value[/model/2012/P01 05156 06204 80 1175 4.jpg]&set=key[rotate],value[]&set=key[width],value[]&set=key[height],value[]&set=key[x],value[]&set=key[y],value[]&set=key[type],value[STILL_LIFE_FRONT]&call=url[file:/product/large]');
$info = pathinfo($parsedUrl['path']);

$dirname = explode('/', $info['dirname'] ?: '');
$dirname = array_filter($dirname, 'strlen');
$dirname = array_map('urlencode', $dirname);
$dirname = implode('/', $dirname);

$basename = urlencode($info['basename'] ?: '');

$path = array_filter(array($dirname, $basename), 'strlen');
$path = '/' . implode('/', $path);

$query = explode('&', $parsedUrl['query'] ?: '');
foreach ($query as &$set)
{
    $set = explode('=', $set, 2);
    $set = array_map('urlencode', $set);
    $set = implode('=', $set);
}
$query = implode('&', $query);

$uri = array_filter(array($path, $query), 'strlen');
$uri = implode('?', $uri);

$fragment = urlencode($parsedUrl['fragment'] ?: '');

$uri = array_filter(array($uri, $fragment), 'strlen');
$uri = implode('#', $uri);

$scheme = $parsedUrl['scheme'] ?: '';
$host = $parsedUrl['host'] ?: '';

$url = array_filter(array($scheme, $host), 'strlen');
$url = implode('://', $url);
$url .= $uri;

$image = new Imagick($url);
Note: this code will emit PHP notices (for the URL parts that are not set).
Try using the urlencode() function to encode the special characters of the URL:
$image = new Imagick(urlencode('http://lp.hm.com/hmprod?set=key[source],value[/model/2012/P01 05156 06204 80 1175 4.jpg]&set=key[rotate],value[]&set=key[width],value[]&set=key[height],value[]&set=key[x],value[]&set=key[y],value[]&set=key[type],value[STILL_LIFE_FRONT]&call=url[file:/product/large]'));
Or, if that doesn't work, try this:
$content = file_get_contents(urlencode('http://lp.hm.com/hmprod?set=key[source],value[/model/2012/P01 05156 06204 80 1175 4.jpg]&set=key[rotate],value[]&set=key[width],value[]&set=key[height],value[]&set=key[x],value[]&set=key[y],value[]&set=key[type],value[STILL_LIFE_FRONT]&call=url[file:/product/large]'));
$image = new Imagick($content);
I just encountered a similar issue. I was using file_get_contents() and was getting the same "ImagickException The filename is too long". The fix for me was finding the tmp directory and setting the correct permissions for it.

CloudFront URL signature works fine for RTMP, why won't it work for download URL?

I've written a PHP script that generates a signed CloudFront URL for RTMP, for use in Flowplayer, and it works just fine, but when I use the same signature-generation method to create a download URL I get an AccessDenied XML response from Amazon. I've tried just about everything and I'm at my wits' end. Does anyone know why the signature works for RTMP streaming but the same signature-generation method fails for a download?
$keyPairId = 'APK...';
$privateKey = '/var/www/certs/pk-APK....pem';
$rtmp = false;
$distribution = 'd2m...';

// Get extension.
$extension = substr($this->getFilename(), strrpos($this->getFilename(), '.') + 1);
$fileName = substr($this->getFilename(), 0, strrpos($this->getFilename(), '.'));

$expires = strtotime(gmdate('Y-m-d H:i:s', strtotime('+3 hours')));

$json = '{"Statement":[{"Resource":"' . $fileName . '","Condition":{"DateLessThan":{"AWS:EpochTime":' . $expires . '}}}]}';

// read cloudfront private key pair
$fp = fopen($privateKey, 'r');
$priv_key = fread($fp, 8192);
fclose($fp);

// create the private key
$key = openssl_get_privatekey($priv_key);

// sign the policy with the private key
// depending on your php version you might have to use
// openssl_sign($json, $signed_policy, $key, OPENSSL_ALGO_SHA1)
openssl_sign($json, $signed_policy, $key);
openssl_free_key($key);

// create url safe signed policy
$base64_signed_policy = base64_encode($signed_policy);
$signature = str_replace(array('+', '=', '/'), array('-', '_', '~'), $base64_signed_policy);

// construct the url
$urlParams = urlencode($this->getFilename()) . '?Expires=' . $expires . '&Signature=' . $signature . '&Key-Pair-Id=' . $keyPairId;

if ($rtmp) {
    $url = (($this->getExtension() != 'flv') ? $this->getExtension() . ':' : '') . $urlParams;
} else {
    $url = 'https://' . $distribution . '.cloudfront.net/' . $urlParams;
}
First of all, signed RTMP URLs are built differently from regular URLs:
RTMP distributions: Include only the stream name. For example, if the
full URL for a streaming video is:
rtmp://s5c39gqb8ow64r.cloudfront.net/videos/mp3_name.mp3
then use the following value for Resource:
videos/mp3_name
Regular signed URLs contain the entire path.
Secondly, CloudFront RTMP distributions only deliver streaming media over RTMP. You said you wanted a download URL, so using an RTMP distribution will not let you download the file.
You probably want to create a CloudFront web distribution linked to the same bucket, then generate a signed URL using the web distribution and access the file that way.
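To make the first point concrete: for a web distribution, the Resource you sign (and the URL you hand out) is the full https:// URL of the object, not just the bare file name. A rough sketch reusing the question's variables, under the assumption that the rest of the signing code stays the same:

// For a web distribution, the policy's Resource is the full object URL.
$resource = 'https://' . $distribution . '.cloudfront.net/' . $this->getFilename();

$json = '{"Statement":[{"Resource":"' . $resource . '","Condition":{"DateLessThan":{"AWS:EpochTime":' . $expires . '}}}]}';

// ...sign $json exactly as before, then build the download URL from the same resource:
$url = $resource . '?Expires=' . $expires . '&Signature=' . $signature . '&Key-Pair-Id=' . $keyPairId;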
