Google Cloud Translate API & Referer Restriction Issue - php

I have a frustrating issue with the Google Cloud Translate API.
I have set up the key restriction correctly for a few domains, including *.example.com/*.
I run the script on the URL https://www.example.com/translate and get the following message:
"status": "PERMISSION_DENIED",
"details": [
{
"#type": "type.googleapis.com/google.rpc.ErrorInfo",
"reason": "API_KEY_HTTP_REFERRER_BLOCKED",
"domain": "googleapis.com",
When I remove the restriction everything works, but I need the restriction to prevent misuse/abuse.
Furthermore, I use this same API key for other Google APIs (Maps, Auth, etc.) and it works perfectly from this domain...
So weird.
Do you have any ideas, or any way to investigate this issue further?
How can I find out which referrer Google sees (or check it with an external service)?
Thanks a lot!
Edit:
PHP code:
require_once(APPPATH . "libraries/GoogleTranslate/vendor/autoload.php");
require_once(APPPATH . "libraries/GoogleTranslate/vendor/google/cloud-translate/src/V2/TranslateClient.php");
$translate = new TranslateClient([
    'key' => 'xXXXx'
]);

// Translate text from English to French.
$result = $translate->translate('Hello world!', [
    'target' => 'fr'
]);

echo $result['text'];
Full error message:
Type: Google\Cloud\Core\Exception\ServiceException
Message: {
  "error": {
    "code": 403,
    "message": "Requests from referer \u003cempty\u003e are blocked.",
    "errors": [
      {
        "message": "Requests from referer \u003cempty\u003e are blocked.",
        "domain": "global",
        "reason": "forbidden"
      }
    ],
    "status": "PERMISSION_DENIED",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.ErrorInfo",
        "reason": "API_KEY_HTTP_REFERRER_BLOCKED",
        "domain": "googleapis.com",
        "metadata": {
          "service": "translate.googleapis.com",
          "consumer": "projects/XXXXX"
        }
      }
    ]
  }
}
Filename: htdocs/application/libraries/GoogleTranslate/vendor/google/cloud-core/src/RequestWrapper.php
Line Number: 368

I will leave here my insights from the discussion on the Public Issue Tracker.
The HTTP referrer restriction is working as intended, but the referer is always empty because the client library does not set it by default. However, it can be added manually. So instead of doing:
$translate = new TranslateClient([
    'key' => 'XXX'
]);
You need to specify the referrer:
$translate = new TranslateClient([
    'key' => '[API_KEY]',
    'restOptions' => [
        'headers' => [
            'referer' => '*.[URL].com/*'
        ]
    ]
]);
Keep in mind that requests of this kind can be sent from any computer (as long as you have the key), since you are not restricting the machine the request is made from, only checking who the referrer is (and it can be set manually). Moreover, API clients that run in a web browser expose their API keys publicly; that's why I recommend using service accounts instead. For more information, see adding application restrictions.
Regarding the HTTP referer: it is a header field that web browsers set to let a web page know where the user is coming from. For example, if you click the link above (HTTP referer), your referer field will be this page.
In summary, since you can put whatever referer you like in the header of a request, this is pretty similar to not having any restriction at all. It really is recommended to use service accounts. To solve this issue quickly, add the referer manually in the headers as shown in the code above.
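If you do switch to a service account, the client can authenticate with the key file instead of an API key, and no referrer restriction is needed at all. A minimal sketch, assuming the JSON key is stored at a placeholder path outside the web root:
<pre><code>require_once(APPPATH . "libraries/GoogleTranslate/vendor/autoload.php");

use Google\Cloud\Translate\V2\TranslateClient;

// Authenticate with a service account key file instead of an API key.
// The path below is a placeholder; keep the key file outside the web root.
$translate = new TranslateClient([
    'keyFilePath' => '/path/outside/webroot/service-account.json'
]);

$result = $translate->translate('Hello world!', [
    'target' => 'fr'
]);

echo $result['text'];
</code></pre>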

I read the comments and you seem to be doing everything right. I would recommend you try the following:
This error message can also appear because you set API restrictions on the API key. Is that the case? Maybe you are restricting this specific API.
If you aren't setting any API restrictions, is it possible to try an IP restriction instead of the domain, just for testing purposes? You can also test the key against the REST endpoint directly, as sketched below.

I had the same issue with Google Translate but not with Maps.
So Maps works with a referrer restriction, but Translate does not.
The only solution I found, with a restriction still in force, was setting up an IP restriction instead of the HTTP referrer (website) restriction.

Related

How to set redirect_uri in Instagram Basic Display on an insecure protocol

I'm trying to follow the documentation to get the Instagram user profile info and feed using Laravel. There is no way for me to set the redirect_uri parameter to http://localhost:8000/callback_address. If I do that, I get the message:
{
  "error_type": "OAuthException",
  "code": 400,
  "error_message": "Invalid redirect_uri"
}
I tried to add it to the valid OAuth redirect URIs in the Facebook developer console, but I get an error message saying that I should use HTTPS. I also tried to use Socialite, but I get the same error. The strange thing is that HTTP redirect URIs are accepted for Facebook OAuth (by default, no need to add them in the developer console) but not for Instagram Basic Display.
This is the redirect statement which yields the error above:
return redirect('https://api.instagram.com/oauth/authorize?client_id={my-client-id}&redirect_uri=https://localhost:8000/login/instagrambasic/callback&scope=user_profile,user_media&response_type=code')
I need to use the HTTP protocol in development; any idea? I've been stuck on this for a day and my eyes are bleeding.
You probably just need to urlencode() the redirect URI. Try something like this:
$query_data = array(
    'client_id'     => '{my-client-id}',
    'redirect_uri'  => 'https://localhost:8000/login/instagrambasic/callback',
    'scope'         => 'user_profile,user_media',
    'response_type' => 'code'
);
$url = sprintf('https://api.instagram.com/oauth/authorize?%s', http_build_query($query_data));
return redirect($url);
If you are using Socialite, this will be done for you, but make sure you use the socialiteproviders/instagram-basic package and not the older socialiteproviders/instagram package, as the latter was for v1 of the API, which is no longer supported.
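For reference (not part of the original answer), the Socialite redirect might look like the sketch below; the driver name is an assumption based on the socialiteproviders/instagram-basic package, so check its README before relying on it:
<pre><code>use Laravel\Socialite\Facades\Socialite;

// Assumed driver name for socialiteproviders/instagram-basic; verify against the package docs.
// The provider builds and URL-encodes the authorize URL (including redirect_uri) for you.
return Socialite::driver('instagrambasic')->redirect();
</code></pre>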

theiconic php-ga localhost vhost page views don't show up

Okay, so I set up a Google Analytics account for testing purposes. I have to work on some stuff and I'm testing things locally before pushing them to our live environment. Since tracking has to work even if JavaScript is off, we use the php-ga-measurement-protocol library from theiconic.
This is the code I use:
$this->analytics = new Analytics(true, false);
$this->analytics->setClientId(filter_input(INPUT_COOKIE, 'gaClientId', FILTER_SANITIZE_STRING))
->setDebug(true)
->setDocumentHostName(getenv('HTTP_HOST'))
->setDocumentLocationUrl(getenv('HTTP_HOST') . getenv('REQUEST_URI'))
->setDocumentPath(getenv('REQUEST_URI'))
->setIpOverride(getenv('HTTP_CLIENT_IP') ?: getenv('HTTP_X_FORWARDED_FOR') ?: getenv('REMOTE_ADDR'))
->setProtocolVersion('1')
->setTrackingId({TRACKING_ID})
->setUserAgentOverride(getenv('HTTP_USER_AGENT'))
->setUserLanguage(strtolower(substr(getenv('HTTP_ACCEPT_LANGUAGE'), 0, 5)));
The URL generated would look something like:
https://www.google-analytics.com/debug/collect?cid=g9m2nds3980dki4ia2rcivtjn3&dh={WEBSITE.LOCAL}&dl={WEBSITE.LOCAL}%2F&dp=%2F&uip=127.0.0.1&v=1&tid=|||&ua=Mozilla%2F5.0%20%28Windows%20NT%206.3%3B%20Win64%3B%20x64%29%20AppleWebKit%2F537.36%20%28KHTML%2C%20like%20Gecko%29%20Chrome%2F74.0.3729.157%20Safari%2F537.36&ul=en-us&dt={DOCUMENT_TITLE}
This is the response:
{
"hitParsingResult": [ {
"valid": true,
"parserMessage": [ ],
"hit": "/debug/collect?cid=g9m2nds3980dki4ia2rcivtjn3\u0026dh={WEBSITE.LOCAL}\u0026dl={WEBSITE.LOCAL}%2F\u0026dp=%2F\u0026uip=127.0.0.1\u0026v=1\u0026tid=|||\u0026ua=Mozilla%2F5.0%20%28Windows%20NT%206.3%3B%20Win64%3B%20x64%29%20AppleWebKit%2F537.36%20%28KHTML%2C%20like%20Gecko%29%20Chrome%2F74.0.3729.157%20Safari%2F537.36\u0026ul=en-us\u0026dt={DOCUMENT_TITLE}?_anon_uip=127.0.0.0"
} ],
"parserMessage": [ {
"messageType": "INFO",
"description": "Found 1 hit in the request."
} ]
}
However, when I go to my Google Analytics page, no active user shows up and I can't figure out why.
Isn't that the intended consequence of sending the hit with debug mode "on"? It validates that the hit is correctly formed, but does not actually record the hit or impact reporting. From your own code:
->setDebug(true)
"hit": "/debug/collect?cid=g9m2nd..."
From the Google hit validation documentation on the /debug/collect endpoint:
Important: hits sent to the Measurement Protocol Validation Server will not show up in reports. They are for debugging only.
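To actually record the hit, the library has to send it to the live /collect endpoint instead of /debug/collect. A minimal sketch, assuming the same theiconic library as in the question; only the debug flag changes, and the tracking ID below is a placeholder:
<pre><code>use TheIconic\Tracking\GoogleAnalytics\Analytics;

$analytics = new Analytics(true); // use SSL

$analytics->setProtocolVersion('1')
    ->setTrackingId('UA-XXXXXX-X') // placeholder property ID
    ->setClientId(filter_input(INPUT_COOKIE, 'gaClientId', FILTER_SANITIZE_STRING))
    ->setDocumentPath(getenv('REQUEST_URI'));
    // ->setDebug(true) is omitted, so the hit goes to /collect and shows up in reports.

$analytics->sendPageview();
</code></pre>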

copyObject Access Denied on S3 Bucket despite policy allowing my site referrer

I am using AWS SDK for PHP to upload and display files to/from my S3 bucket.
The files should only be accessible through my site, no other referrer allowed - no hotlinking etc.
I also need to be able to copy objects within the bucket.
I create and connect as normal:
$s3Client = new Aws\S3\S3Client([
    'version' => 'latest',
    'region'  => 'eu-west-2'
]);
$s3 = $s3Client::factory(array(
    'version'     => 'latest',
    'region'      => 'eu-west-2',
    'credentials' => array(
        'provider' => $provider,
        'key'      => $key,
        'secret'   => $secret
    )
));
And execute the CopyObject command:
$s3->copyObject([
    'Bucket'     => "{$targetBucket}",
    'Key'        => "{$targetKeyname}",
    'CopySource' => "{$sourceBucket}/{$sourceKeyname}",
]);
I have tried a policy using "Allow if string like referrer", but AWS then tells me I'm allowing public access?!
Everything works just fine, EVEN the copyObject action, but the files are still accessible directly and from everywhere!
I tried using "Deny if string not like referrer", which works mostly as expected: I can upload and display the files, and the files don't show when linked directly (which is what I want). However, the copyObject action no longer works and I get the access denied error.
I've tried everything else I can think of and spent hours googling and searching for answers, but to no avail.
Here's each separate policy...
ALLOW FILE ACCESS ONLY (GetObject) if string LIKE referrer:
{
  "Version": "2008-10-17",
  "Id": "",
  "Statement": [
    {
      "Sid": "Allow access if referer is my site",
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": "s3:GetObject",
      "Resource": [
        "arn:aws:s3:::MY-BUCKET/*"
      ],
      "Condition": {
        "StringLike": {
          "aws:Referer": [
            "http://MY-SITE/*",
            "https://MY-SITE/*"
          ]
        }
      }
    }
  ]
}
RESULT: uploads & copyObject work but files are still accessible everywhere
DENY FILE ACCESS (GetObject) if string NOT LIKE referrer:
{
  "Version": "2008-10-17",
  "Id": "",
  "Statement": [
    {
      "Sid": "Deny access if referer is not my site",
      "Effect": "Deny",
      "Principal": {
        "AWS": "*"
      },
      "Action": "s3:GetObject",
      "Resource": [
        "arn:aws:s3:::MY-BUCKET/*"
      ],
      "Condition": {
        "StringNotLike": {
          "aws:Referer": [
            "http://MY-SITE/*",
            "https://MY-SITE/*"
          ]
        }
      }
    }
  ]
}
RESULT: the copyObject action no longer works and I get the access denied error
AWS is probably warning you that it's public because it is, for all practical purposes, still public.
Warning
This key should be used carefully: aws:referer allows Amazon S3 bucket owners to help prevent their content from being served up by unauthorized third-party sites to standard web browsers. [...] Since aws:referer value is provided by the caller in an HTTP header, unauthorized parties can use modified or custom browsers to provide any aws:referer value that they choose. As a result, aws:referer should not be used to prevent unauthorized parties from making direct AWS requests. It is offered only to allow customers to protect their digital content, stored in Amazon S3, from being referenced on unauthorized third-party sites.
https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html
If you are okay with this primitive mechanism, use it, but be aware that all it does is trust the claim made by the browser.
And that's your problem with copyObject() -- the request is not being made by a browser so there is no Referer header to validate.
You can use the StringLikeIfExists condition test to Deny only wrong referers (ignoring the absence of a referer, as would occur with an object copy) or -- better -- just grant s3:GetObject with StringLike your referer, understanding that the public warning is correct -- this allows unauthenticated access, which is what referer checking amounts to. Your content is still publicly accessible but not from a standard, unmodified web browser if hotlinked from another site.
For better security, you will want to render your HTML with pre-signed URLs (with short expiration times) for your S3 assets, or otherwise do full and proper authorization using Amazon Cognito.
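As an illustration of the pre-signed URL approach (a sketch reusing the $s3 client and bucket/key variables from the question, not code from the answer), generating a short-lived URL with the AWS SDK for PHP looks roughly like this:
<pre><code>// Build a GetObject command for the object you want to expose temporarily.
$cmd = $s3->getCommand('GetObject', [
    'Bucket' => $targetBucket,
    'Key'    => $targetKeyname,
]);

// Sign it with a short expiration; embed the resulting URL in your HTML.
$request = $s3->createPresignedRequest($cmd, '+10 minutes');
$presignedUrl = (string) $request->getUri();

echo $presignedUrl;
</code></pre>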
Objects in Amazon S3 are private by default. There is no access unless it is granted somehow (eg on an IAM User, IAM Group or an S3 bucket policy).
The above policies are all Deny policies, which can override an Allow policy. Therefore, they aren't the reason why something is accessible.
You should start by discovering what is granting access, and then remove that access. Once the objects are private again, you should create a Bucket Policy with Allow statements that define in what situations access is permitted.

YouTube Content ID API with Service Account returns Forbidden error

I've created a service account for use with the YouTube Content ID API. I'm following the steps under Set up your service account on:
https://developers.google.com/youtube/partner/guides/oauth2_for_service_accounts
The steps seem to be a bit outdated, but I managed to create a service account and I've got its email address. I didn't know if I needed to assign a role, so I didn't.
I visited the Content ID users page and invited the service account user.
The problem is that I get the error "the request could not be completed", but when I refresh the page, the user appears to have been added successfully.
However, when I make requests using this service account, I get forbidden errors. For example, the error below occurs when I try to get a list of claims.
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "forbidden",
        "message": "Forbidden"
      }
    ],
    "code": 403,
    "message": "Forbidden"
  }
}
But some requests work, like listing/inserting labels and listing content owners.
All the permissions are assigned to the user, so I'm not sure what the reason for the forbidden errors is.
Here's the PHP code:
$OAUTH2_KEY_FILE = 'key.json';
$client = new \Google_Client();
$client->setAuthConfig($OAUTH2_KEY_FILE);
$client->setScopes([\Google_Service_YouTubePartner::YOUTUBEPARTNER]);
$youtubePartner = new \Google_Service_YouTubePartner($client);
dd($youtubePartner->claims->listClaims([
'onBehalfOfContentOwner' => 'XXX'
]));

GA not tracking certain actions

Our site is tracking CC and other transactions just fine, but has a problem with EFT transactions.
I have established that we are writing to the socket that is listening for GA calls with the correct information (compared it to another action that is being tracked correctly).
I need some help debugging the issue and was wondering if there was anywhere on the GA interface that I could see attempted and failed calls?
The moment you see a call in the interface it is by definition not a failed call, so no. But there is a debugger extension that writes errors (if any) to the browser console.
If you do your tracking server-side (as you are talking about sockets), you could log the outgoing tracking calls to a file and send them to the debug endpoint by inserting the word "debug" after the hostname (so this would look something like google-analytics.com/debug/collect?v=1&tid=.....).
The debug endpoint will return a JSON response that points out missing or invalid fields. Example response below:
{
"hitParsingResult": [ {
"valid": false,
"parserMessage": [ {
"messageType": "ERROR",
"description": "The value provided for parameter 'tid' is invalid. Please see http://..... for details.",
"messageCode": "VALUE_INVALID",
"parameter": "tid"
} ],
"hit": "/debug/collect?v=1\u0026_v=j41d\u0026a=335525335\u0026t=pageview\u0026_s=1\u0026dl=file%3A%2F%2F%2FUsers%2Fepierstorff%2FDesktop%2Ftest.html\u0026dp=http%3A%2F%2F%2FUsers%2Fepierstorff%2FDesktop%2Ftest.html\u0026ul=de\u0026de=windows-1252\u0026dt=OFfline\u0026sd=24-bit\u0026sr=2560x1440\u0026vp=2385x678\u0026je=0\u0026fl=21.0%20r0\u0026_u=QEAAAIABI~\u0026jid=2113413999\u0026cid=761062822.1461745183\u0026tid=UA-XXXXXX-X\u0026_r=1\u0026z=140208380"
} ],
"parserMessage": [ {
"messageType": "INFO",
"description": "Found 1 hit in the request."
} ]
}
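For completeness, here is a rough PHP sketch (under assumptions, not the asker's code) of sending one of the logged EFT hits to the debug endpoint and inspecting the verdict; the tracking ID and document path are placeholders:
<pre><code>// Send a single Measurement Protocol hit to the validation endpoint
// and inspect the parser's verdict. Nothing sent here shows up in reports.
$payload = http_build_query([
    'v'   => '1',
    'tid' => 'UA-XXXXXX-X',        // placeholder property ID
    'cid' => '555',                // any anonymous client ID
    't'   => 'pageview',
    'dp'  => '/eft/confirmation',  // hypothetical page in the EFT flow
]);

$ch = curl_init('https://www.google-analytics.com/debug/collect');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $payload,
    CURLOPT_RETURNTRANSFER => true,
]);

$result = json_decode(curl_exec($ch), true);
curl_close($ch);

// "valid" tells you whether the hit would have been accepted;
// "parserMessage" lists any missing or invalid fields.
print_r($result['hitParsingResult'][0] ?? $result);
</code></pre>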
