I am working on a web project that involves connecting to SharePoint Online via PHP and accessing the files stored on it. But I am extremely new to all this, and have hit a wall.
I have the URL of the file I'm trying to access
Using the phpSPO library, I am authenticated and connected to SharePoint.
The question is: how do I actually access the URL? If I follow the link directly, it redirects me to the login page for SharePoint. But we want the login to happen "behind the scenes" - and apparently the authentication step doesn't quite do that.
The company we are working with told us that we would need to request an anonymous link for the URL by calling a function. Problem is, the function they told us to use works in ASPX, but doesn't appear to be available in PHP.
This is the code they pointed us to:
Uri siteUri = new Uri(siteUrl);
Web web = context.Web;
SecureString passWord = new SecureString();
foreach (char c in "password".ToCharArray())
    passWord.AppendChar(c);
context.Credentials = new SharePointOnlineCredentials("userid", passWord);
WebDocs.Parameter1 = "123456";
WebDocs.Parameter2 = "Test";
context.Web.CreateAnonymousLinkForDocument(WebDocs.Parameter1, WebDocs.Parameter2, ExternalSharingDocumentOption.View);
But how can I translate that into PHP? Can I even do that?
And if not, is there another way that I can access the file to display it to my user?
// this says the function CreateAnonymousLinkForDocument doesn't exist
function getLink(ClientContext $ctx) {
    $anonymousLink = $ctx->getWeb()->CreateAnonymousLinkForDocument();
    $ctx->load($anonymousLink);
    $ctx->executeQuery();
}
Well, after hours and hours of searching the Internet....
The answer was right in front of my nose.
I started browsing through examples/SharePoint/file_examples.php, which comes with the phpSPO library, and discovered two functions (either one works): one is called downloadFile, and the other is downloadFileAsStream.
function downloadFile(ClientRuntimeContext $ctx, $fileUrl, $targetFilePath) {
    $fileContent = Office365\PHP\Client\SharePoint\File::openBinary($ctx, $fileUrl);
    file_put_contents($targetFilePath, $fileContent);
    print "File {$fileUrl} has been downloaded successfully\r\n";
}
function downloadFileAsStream(ClientRuntimeContext $ctx, $fileUrl, $targetFilePath) {
    $fileUrl = rawurlencode($fileUrl);
    $fp = fopen($targetFilePath, 'w+');
    $url = $ctx->getServiceRootUrl() . "web/getfilebyserverrelativeurl('$fileUrl')/\$value";
    $options = new \Office365\PHP\Client\Runtime\Utilities\RequestOptions($url);
    $options->StreamHandle = $fp;
    $ctx->executeQueryDirect($options);
    fclose($fp);
    print "File {$fileUrl} has been downloaded successfully\r\n";
}
Since I was trying to download a PDF, I just pointed these functions at a target path on our own server so the PDF is saved there... and it works beautifully!
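In case it helps anyone else, here is roughly how I wire it up after authenticating. The site URL, credentials, and file paths below are placeholders, and the authentication calls follow the pattern from the library's own examples:

use Office365\PHP\Client\Runtime\Auth\AuthenticationContext;
use Office365\PHP\Client\SharePoint\ClientContext;

// Placeholder site URL, credentials, and server-relative file path
$siteUrl = 'https://contoso.sharepoint.com/sites/demo';

$authCtx = new AuthenticationContext($siteUrl);
$authCtx->acquireTokenForUser('user@contoso.com', 'password'); // the "behind the scenes" login
$ctx = new ClientContext($siteUrl, $authCtx);

// Pull the PDF down to our own server, then display that local copy to the user
downloadFile($ctx, '/sites/demo/Shared Documents/report.pdf', __DIR__ . '/downloads/report.pdf');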
I am using the code example found at https://zatackcoder.com/upload-file-to-google-cloud-storage-using-php/ to upload a file to Google Cloud. Everything works fine (I am new to this, so I'm just psyched to have it running!), but I recently read a blurb saying you can tell Google Cloud the MD5 of the file you are uploading, and Google will validate it against the actual upload, failing the upload if a transmission problem results in a different MD5.
My question is 2 part:
How do I calculate the MD5 in a format that Google Cloud will understand? I believe the calculation is actually the base64_encode() of the MD5 returned from PHP's md5_file(), but when I do this, the resulting string is quite a bit different from the "md5Hash": "TPmaCjp5uh1jxIQahhOAsQ==" that Google returns. I am using the Google Cloud Storage API to perform the upload (see the link above and the code snippet below).
Once I have the properly calculated md5Hash, how do I specify that in the upload() function to pass that value as part of the upload? Has anyone done this and can share their expertise?
Many thanks!
Here is the snippet from the larger project in the link above. The upload works fine, I am just looking for how to generate the appropriate md5Hash and to include that hash in the upload.
function uploadFile($bucketName, $fileContent, $cloudPath) {
    $privateKeyFileContent = $GLOBALS['privateKeyFileContent'];
    // Connect to Google Cloud Storage using the private key for authentication
    try {
        $storage = new StorageClient([
            'keyFile' => json_decode($privateKeyFileContent, true)
        ]);
    } catch (Exception $e) {
        // Possibly an invalid private key?
        print $e;
        return false;
    }
    // Select the bucket to work in
    $bucket = $storage->bucket($bucketName);
    // Upload/replace the file
    // NOTE:
    // a. do not prefix $cloudPath with '/' - a leading '/' becomes a separate folder name
    // b. if an object already exists at $cloudPath it is overwritten without confirmation,
    //    and the private key MUST have the 'storage.objects.delete' permission to replace a file
    $storageObject = $bucket->upload(
        $fileContent,
        ['name' => $cloudPath]
    );
    // Did it succeed?
    return $storageObject != null;
}
According to the documentation for the PHP client library for Cloud Storage, you can specify a metadata parameter in the upload method that returns a StorageObject.
This parameter is an array that allows you to pass the md5Hash. The full list of properties that you can pass is stated here.
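For illustration, a minimal sketch of how that could look inside the uploadFile() function from the question, assuming the metadata array accepts an md5Hash entry as described above ($fileContent and $cloudPath are the same variables used in the question):

// md5Hash must be the base64 encoding of the RAW (binary) MD5 digest, not the usual
// hex string, which is why base64-encoding the hex output looks nothing like the
// "TPmaCjp5uh1jxIQahhOAsQ==" value Google returns.
$md5Hash = base64_encode(md5($fileContent, true));

$storageObject = $bucket->upload(
    $fileContent,
    [
        'name'     => $cloudPath,
        'metadata' => [
            'md5Hash' => $md5Hash, // the upload should fail if the server-side digest differs
        ],
    ]
);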
I am creating a REST API with Laravel that allows a user to select an image file from their Android device and upload it to the server. The image is converted to base64 before it is sent to the server alongside the other parameters. I want to convert this base64 back into a file, store it on the server, and generate a link that can be used to access it. Here is what I have tried so far, and it doesn't work (I have already created a symlink to storage):
public function create(Request $request)
{
    $location = new Location();
    $location->name = $request->location_name;
    $location->latitude = $request->latitude;
    $location->longitude = $request->longitude;
    $location->saveOrFail();

    $provider = new Provider();
    $provider->name = $request->provider_name;
    $provider->location_id = $location->id;
    $provider->category_id = $request->category_id;
    $provider->description = $request->description;
    $provider->image = request()->file(base64_decode($request->encoded_image))->store('public/uploads');
    $provider->saveOrFail();

    return json_encode(array('status' => 'success', 'message' => 'Provider created successfully'));
}
As already commented by Amando, you can use the Intervention/Image package; having used it for many years, I can say it will do what you want, and do it very well.
What I would also add, though, is that you may want to consider whether you need to store it as a file at all.
Depending on what it will be used for, and the size, etc., you could consider storing it in the DB itself, along with any other information. This removes the dependency on a file server and makes your application much more flexible with regard to infrastructure requirements.
At the end of the day, files are just data. If you will always fetch the file together with the other data, reduce the steps and keep related data together.
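If you do decide to keep it as a file, the decode-and-store step itself does not need request()->file() at all. A rough sketch follows; the file name, extension, and disk choice are assumptions on my part, and I'm assuming the client sends plain base64 without a "data:image/...;base64," prefix:

use Illuminate\Support\Facades\Storage;
use Illuminate\Support\Str;

// Decode the base64 payload and write it under storage/app/public/uploads
// (requires the storage symlink: php artisan storage:link)
$imageData = base64_decode($request->encoded_image);
$fileName  = 'uploads/' . Str::random(40) . '.jpg'; // extension assumed; detect it if you need to

Storage::disk('public')->put($fileName, $imageData);

$provider->image = Storage::disk('public')->url($fileName); // e.g. /storage/uploads/xxxx.jpg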
Either way, hope you get it sorted :)
I am connecting to an API and getting a report in TSV format. I need to upload the report to Google BigQuery, but all the documentation I have found so far loads data from Google Cloud Storage. Is there a way to load data from a separate URL?
Here is the code I have thus far:
$service = new Google_BigqueryService($client);
// Your project number, from the developers.google.com/console project you created
// when signing up for BigQuery
$project_number = '*******';
// Information about the destination table
$destination_table = new Google_TableReference();
$destination_table->setProjectId($project_number);
$destination_table->setDatasetId('php_test');
$destination_table->setTableId('my_new_table');
// Information about the schema for your new table
$schema_fields = array();
$schema_fields[0] = new Google_TableFieldSchema();
$schema_fields[0]->setName('Date');
$schema_fields[0]->setType('string');
$schema_fields[1] = new Google_TableFieldSchema();
$schema_fields[1]->setName('PartnerId');
$schema_fields[1]->setType('string');
....
$destination_table_schema = new Google_TableSchema();
$destination_table_schema->setFields($schema_fields);
// Set the load configuration, including source file(s) and schema
$load_configuration = new Google_JobConfigurationLoad();
$load_configuration->setSourceUris(array('EXTERNAL URL WITH TSV'));
$load_configuration->setDestinationTable($destination_table);
$load_configuration->setSchema($destination_table_schema);
$job_configuration = new Google_JobConfiguration();
$job_configuration->setLoad($load_configuration);
$load_job = new Google_Job();
$load_job->setKind('load');
$load_job->setConfiguration($job_configuration);
$jobs = $service->jobs;
$response = $jobs->insert($project_number, $load_job);
I realize that this is meant for Google Cloud Storage, but I do not want to use it if I am just going to pass data through it and delete it within the hour.
Is there PHP code that I can use that will allow me to use external URLs and load data from them?
As Pentiuum10 mentioned above, BigQuery doesn't support reading from non-Google Cloud Storage URLs. The logistics involved would be tricky ... we'd need credentials to access the data, which we don't really want to have to be responsible for. If this is a popular request, we might end up supporting external paths that are either unrestricted or support oauth2. That said, we haven't had a lot of users asking for this so far.
Feel free to file a feature request via the public issue tracker here: https://code.google.com/p/google-bigquery/issues/.
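Until something like that exists, the pass-through flow mentioned in the question (stage the TSV in Cloud Storage, load it, then delete it) can at least be scripted. A rough sketch with the current google/cloud-storage client, where the project, bucket, and object names are placeholders and $tsvContent is the report already fetched from the API:

use Google\Cloud\Storage\StorageClient;

$storage = new StorageClient(['projectId' => 'my-project']);
$bucket  = $storage->bucket('my-staging-bucket');

// Stage the report so the load job has a gs:// source URI to read from
$object = $bucket->upload($tsvContent, ['name' => 'staging/report.tsv']);

// ... run the load job with sourceUris = ['gs://my-staging-bucket/staging/report.tsv'] ...

// Remove the staged copy once the load job has finished
$object->delete();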
I have an app that uploads user files to S3. At the moment, the ACL for the folders and files is set to private.
I have created a db table (called docs) that stores the following info:
id
user_id
file_name (original file as specified by the user)
hash_name (random hash used to save the file on amazon)
So, when a user wants to download a file, I first check in the db table that they have access to the file. I'd prefer not to have the file downloaded to my server first and then sent to the user; I'd like them to be able to grab the file directly from Amazon.
Is it OK to rely on a very, very long hash name (making it basically impossible for anyone to randomly guess a filename)? In this case, I can set the ACL for each file to public-read.
Or, are there other options that I can use to serve the files whilst keeping them private?
Remember, once the link is out there, nothing prevents a user from sharing that link with others. Then again, nothing prevents the user from saving the file elsewhere and sharing a link to the copy of the file.
The best approach depends on your specific needs.
Option 1 - Time Limited Download URL
If applicable to your scenario, you can also create expiring (time-limited) custom links to the S3 contents. That would allow the user to download content for a limited amount of time, after which they would have to obtain a new link.
http://docs.amazonwebservices.com/AmazonS3/latest/dev/S3_QSAuth.html
Option 2 - Obfuscated URL
If avoiding routing the file through your web server matters more to you than the risk that a URL, however obscure, might be intentionally shared, then use a hard-to-guess link name. Such a link remains valid "forever", which also means it can be shared "forever".
Option 3 - Download through your server
If you are concerned about the link being shared and certainly want users to authenticate through your website, then serve the content through your website after verifying user credentials.
This option also allows the link to remain valid "forever", but requires the user to log in (or perhaps just to have an authentication cookie in the browser) to access the link.
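For illustration, a rough sketch of Option 3 with the AWS SDK for PHP and the docs table described in the question; the bucket name, column names, and the surrounding framework are assumptions:

use Aws\S3\S3Client;

function streamUserFile(S3Client $s3, PDO $db, $userId, $docId)
{
    // Confirm the logged-in user owns the document before touching S3
    $stmt = $db->prepare('SELECT file_name, hash_name FROM docs WHERE id = ? AND user_id = ?');
    $stmt->execute(array($docId, $userId));
    $doc = $stmt->fetch(PDO::FETCH_ASSOC);
    if (!$doc) {
        http_response_code(403);
        return;
    }

    // Fetch the private object and stream it back under its original name
    $result = $s3->getObject(array(
        'Bucket' => 'my-private-bucket', // placeholder bucket name
        'Key'    => $doc['hash_name'],
    ));

    header('Content-Type: ' . $result['ContentType']);
    header('Content-Disposition: attachment; filename="' . $doc['file_name'] . '"');
    echo $result['Body'];
}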
I just want to post the PHP solution with code, if anybody has the same problem.
Here's the code I used:
$aws_access_key_id = 'AKIAIOSFODNN7EXAMPLE';
$aws_secret_key = 'YourSecretKey12345';
$aws_bucket = 'bucket';
$file_path = 'directory/image.jpg';
$timeout = '+10 minutes';
// get the URL!
$url = get_public_url($aws_access_key_id,$aws_secret_key,$aws_bucket,$file_path,$timeout);
// print the URL!
echo($url);
function get_public_url($keyID, $s3Key, $bucket, $filepath, $timeout)
{
    $expires = strtotime($timeout);
    // Sign with the function's own parameters, not the globals from the calling scope
    $stringToSign = "GET\n\n\n{$expires}\n/{$bucket}/{$filepath}";
    $signature = urlencode(hex2b64(hmacsha1($s3Key, utf8_encode($stringToSign))));
    $url = "https://{$bucket}.s3.amazonaws.com/{$filepath}?AWSAccessKeyId={$keyID}&Signature={$signature}&Expires={$expires}";
    return $url;
}
function hmacsha1($key, $data)
{
    $blocksize = 64;
    $hashfunc = 'sha1';
    if (strlen($key) > $blocksize) {
        $key = pack('H*', $hashfunc($key));
    }
    $key  = str_pad($key, $blocksize, chr(0x00));
    $ipad = str_repeat(chr(0x36), $blocksize);
    $opad = str_repeat(chr(0x5c), $blocksize);
    $hmac = pack('H*', $hashfunc(($key ^ $opad) . pack('H*', $hashfunc(($key ^ $ipad) . $data))));
    return bin2hex($hmac);
}

function hex2b64($str)
{
    $raw = '';
    for ($i = 0; $i < strlen($str); $i += 2) {
        $raw .= chr(hexdec(substr($str, $i, 2)));
    }
    return base64_encode($raw);
}
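For what it's worth, the hand-rolled signing above uses the legacy query-string authentication scheme; newer regions only accept Signature Version 4, so if you can pull in the AWS SDK for PHP (v3), the same kind of presigned URL can be produced like this (region, bucket, key, and credentials are placeholders):

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version'     => 'latest',
    'region'      => 'us-east-1',
    'credentials' => [
        'key'    => 'AKIAIOSFODNN7EXAMPLE',
        'secret' => 'YourSecretKey12345',
    ],
]);

$command = $s3->getCommand('GetObject', [
    'Bucket' => 'bucket',
    'Key'    => 'directory/image.jpg',
]);

// Signature Version 4 presigned URL, valid for 10 minutes
$request = $s3->createPresignedRequest($command, '+10 minutes');
echo (string) $request->getUri();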
I have somewhat of a knowledge of PHP, and I would like to connect the Campaign Monitor API (link) with my website, so that when a user enters something into the form on my site, it gets added to the database on the Campaign Monitor servers. I found the PHP code example zip file, but it contains about 30 files, and I have no idea where to begin.
Does anyone know of a tutorial anywhere that explains how to connect to the API in a step-by-step manner? The code files by themselves include too much code that I may not need for simply connecting to the database and adding and deleting users, since I only want to give the user the power to add and delete users from the mailing list.
This actually looks pretty straightforward. In order to use the API, you simply need to include() the CMBase.php file that is in that zip file.
Once you've included that file, you can create a CampaignMonitor object, and use it to access the API functions. I took this example out of one of the code files in there:
require_once('CMBase.php');
$api_key = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx';
$client_id = null;
$campaign_id = null;
$list_id = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx';
$cm = new CampaignMonitor( $api_key, $client_id, $campaign_id, $list_id );
//This is the actual call to the method, passing email address, name.
$result = $cm->subscriberAdd('joe@notarealdomain.com', 'Joe Smith');
You can check the result of the call like this (again taken from their code examples):
if ($result['Result']['Code'] == 0)
    echo 'Success';
else
    echo 'Error : ' . $result['Result']['Message'];
Since you're only interested in adding and deleting users from a mailing list, I think the only two API calls you need to worry about are subscriberAdd() and subscriberUnsubscribe():
$result = $cm->subscriberAdd('joe@notarealdomain.com', 'Joe Smith');
$result = $cm->subscriberUnsubscribe('joe@notarealdomain.com');
Hope that helps. The example files that are included in that download are all singular examples of an individual API method call, and the files are named in a decent manner, so you should be able to look at any file for an example of the corresponding API method.
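To hook it up to your form, a minimal handler might look like the sketch below; the form field names and the 'action' value are assumptions, and the API key and list ID are the same placeholders as above:

require_once('CMBase.php');

$api_key = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx';
$list_id = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx';
$cm = new CampaignMonitor($api_key, null, null, $list_id);

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $email = trim($_POST['email']);
    $name  = trim($_POST['name']);

    // 'action' is a hypothetical form field distinguishing subscribe from unsubscribe
    if ($_POST['action'] === 'unsubscribe') {
        $result = $cm->subscriberUnsubscribe($email);
    } else {
        $result = $cm->subscriberAdd($email, $name);
    }

    if ($result['Result']['Code'] == 0)
        echo 'Success';
    else
        echo 'Error : ' . $result['Result']['Message'];
}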