I am trying to download a click performance report from the AdWords API. For my example I am only selecting the Date field.
function DownloadCriteriaReportExample(AdWordsUser $user, $filePath) {
  // Load the service, so that the required classes are available.
  $user->LoadService('ReportDefinitionService', ADWORDS_VERSION);
  // Create selector.
  $selector = new Selector();
  $selector->fields = array('Date');
  // Filter out deleted criteria.
  $selector->predicates[] = new Predicate('Status', 'NOT_IN', array('DELETED'));
  // Create report definition.
  $reportDefinition = new ReportDefinition();
  $reportDefinition->selector = $selector;
  $reportDefinition->reportName = 'Criteria performance report #' . uniqid();
  $reportDefinition->dateRangeType = 'YESTERDAY';
  $reportDefinition->reportType = 'CLICK_PERFORMANCE_REPORT';
  $reportDefinition->downloadFormat = 'CSV';
  // Exclude criteria that haven't received any impressions over the date range.
  $reportDefinition->includeZeroImpressions = FALSE;
  // Set additional options.
  $options = array('version' => ADWORDS_VERSION, 'returnMoneyInMicros' => TRUE);
  // Download report.
  ReportUtils::DownloadReport($reportDefinition, $filePath, $user, $options);
  printf("Report with name '%s' was downloaded to '%s'.\n",
      $reportDefinition->reportName, $filePath);
}
The error I get is: "ReportDefinitionError.INVALID_FIELD_NAME_FOR_REPORT".
The same script runs with no problems for the Criteria Performance Report.
https://developers.google.com/adwords/api/docs/appendix/reports#click
The issue is with your predicate: the Click Performance Report does not have a 'Status' field, so remove that predicate; that is most likely your problem.
Also remove
$reportDefinition->includeZeroImpressions = FALSE;
You don't need it, since this is a click performance report.
The Date field is also a segment, so if the above does not work, try adding at least one attribute field, such as GclId (see the sketch below).
Since this report can only be run for a single day, it seems silly to select only the date.
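For illustration, a minimal sketch of the adjusted report definition, based on the code in the question, with the Status predicate and includeZeroImpressions removed and an attribute field (GclId) added alongside Date; check the field names against your API version:

$selector = new Selector();
$selector->fields = array('GclId', 'Date'); // at least one attribute plus the Date segment

$reportDefinition = new ReportDefinition();
$reportDefinition->selector = $selector;
$reportDefinition->reportName = 'Click performance report #' . uniqid();
$reportDefinition->dateRangeType = 'YESTERDAY';
$reportDefinition->reportType = 'CLICK_PERFORMANCE_REPORT';
$reportDefinition->downloadFormat = 'CSV';

// No Status predicate and no includeZeroImpressions for this report type.
ReportUtils::DownloadReport($reportDefinition, $filePath, $user,
    array('version' => ADWORDS_VERSION));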
Hope this helps.
See this link for the reporting fields; if you plan on running a variety of reports, you will find it very useful:
https://developers.google.com/adwords/api/docs/appendix/reports#click
Is there not a little bit more info after that error, such as Trigger = 'Status' or something? That will often tell you which column is causing the error.
If that does not help, then run the GetReportFields.php example to see a list of the field names and check that they match the ones you are trying to use.
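If you don't have that example handy, a rough sketch of the same check, assuming the old-style AdWords PHP library used in the question:

// Get the ReportDefinitionService and list the fields the API accepts
// for this report type (and which are attributes, segments or metrics).
$reportDefinitionService = $user->GetService('ReportDefinitionService', ADWORDS_VERSION);
$fields = $reportDefinitionService->getReportFields('CLICK_PERFORMANCE_REPORT');
foreach ($fields as $field) {
  printf("%s (%s)\n", $field->fieldName, $field->fieldType);
}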
Also, the field names do change between versions, so the example they show might only have names that work in v201402 while you are using v201309. I had this issue, and once I switched to the new library it was fixed.
From the documentation for the Click Performance Report (https://developers.google.com/adwords/api/docs/appendix/reports/click-performance-report):
"Note: This report can be run only for a single day and only for dates up to 90 days before the time of the request."
So, I imagine you can't select the Date field, because it is implied by the single date you must filter by.
I know it's late, and you've probably moved on, but maybe this will help someone else with the same issue.
I am currently facing some strange behaviour in Shopware 6.
What I need is to get an order's document information (invoice number and credit note number) when the order is refunded.
Here is how I am getting the DocumentEntity objects via the orderRepository:
$criteria = new Criteria([$orderId]);
$criteria->addAssociation('lineItems');
$criteria->addAssociation('documents');
$orderObject = $this->orderRepository->search($criteria, $context);
$documents = $orderObject->first()->getDocuments();
Normal behaviour
When the order state is set to "refunded_partially", $documents perfectly contains what it should.
The problem
When the order state is set to "refunded", $documents is empty and I have no errors in logs.
Maybe I overlooked something, but I saw no differences between the dumps of $orderObject when the state is "refunded" and when it is "refunded_partially".
Does someone have a clue how to handle this correctly?
As this one is pretty tricky to do, I recommend using a specific event to work with documents. The event is triggered when a document is created for an order (invoice, credit note, etc.).
Example here
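In case the linked example is no longer reachable, a rough sketch of the idea; it assumes the DAL-generated document.written event (fired whenever a document entity such as an invoice or credit note is written), and the namespace and class name are illustrative:

<?php declare(strict_types=1);

namespace MyPlugin\Subscriber;

use Shopware\Core\Framework\DataAbstractionLayer\Event\EntityWrittenEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

// Illustrative subscriber; register it as a service with the
// kernel.event_subscriber tag in your plugin's services.xml.
class DocumentWrittenSubscriber implements EventSubscriberInterface
{
    public static function getSubscribedEvents(): array
    {
        return [
            'document.written' => 'onDocumentWritten',
        ];
    }

    public function onDocumentWritten(EntityWrittenEvent $event): void
    {
        foreach ($event->getWriteResults() as $result) {
            $payload = $result->getPayload();
            // The payload typically carries the document id, orderId and config;
            // load the document (with its documentType association) here to read
            // the invoice or credit note number for the refunded order.
        }
    }
}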
My requirement is to restrict a content element to IPs from a specific country (e.g. Austria). That means visitors coming from Austrian IPs should see the content element, and for all other users it should be hidden. I am using a GeoIP solution to determine the country. I wrote a small extension with a user function that sets the hidden flag to 1 or 0 based on the country. But because of TYPO3 caching, I have to clear the cache every time for the change to show up in the frontend. I included the extension as USER_INT so it is non-cacheable, but unfortunately that is not working: the functionality works, but because of caching the changes are not reflected in real time.
$uid = 175; // uid of the content element to show/hide
$geoplugin = new \geoPlugin();
$geoplugin->locate();
$countryCode = $geoplugin->countryCode;
if ($countryCode == 'AT') {
    $GLOBALS['TYPO3_DB']->exec_UPDATEquery('tt_content', 'uid IN (' . $uid . ')', array('hidden' => 0));
} else {
    $GLOBALS['TYPO3_DB']->exec_UPDATEquery('tt_content', 'uid IN (' . $uid . ')', array('hidden' => 1));
}
Is there any method available in TYPO3 to restrict a content element to specific IPs / countries, or can you suggest a solution to fix this, please?
Jost's solution is much less dirty than hiding the element in the database depending on the visitor's country; your way, the database probably changes on every visit.
Just create a micro extension.
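For example, the GeoIP check could live in such an extension (or in typo3conf/AdditionalConfiguration.php) as a userFunc used by a TypoScript condition; the function name below is made up, and the point is that condition results get their own cache variant, so nothing in the database has to change:

// Hypothetical userFunc for a TypoScript condition, e.g.:
//
//   [userFunc = user_isAustrianVisitor()]
//     # render the restricted content element only for Austrian visitors
//   [global]
//
// TYPO3 keeps separate cache variants per condition result, so the
// hidden flag never has to be flipped per request.
function user_isAustrianVisitor()
{
    $geoplugin = new \geoPlugin();
    $geoplugin->locate();
    return $geoplugin->countryCode === 'AT';
}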
I am trying to download a live feed of property listings from CREA's DDF. I am making an API request via PHP to the DDF and pulling all the recent listings into my DB. This works fine; the issue is that my clients' listings, which are in the DDF and should be pulled with all the other listings, are not being pulled. I seem to get all the listings for the surrounding area, but maybe not, since I can't receive my clients' listings, which should be part of the pull. When I talked to the CREA people, they said my clients' listings are in the DDF, so I should be able to pull them with all the other listings. I was hoping to get some advice from people who have a better understanding of making requests like this, or better yet, of using CREA's DDF.
I will provide my code below; I will try to include only the relevant parts and take out the unnecessary code to make this a little easier. If you want to see more of the code, I will add those parts on request.
If I understand correctly, I need to add onto my parameters array, but I really don't know why my request is doing this, so any help would be awesome!
Here is the code for my download.php file
$TimeBackPull = "-24 hours";

/* RETS Variables */
require("PHRets_CREA.php");
$RETS = new PHRets();
$RETSURL = "http://data.crea.ca/Login.svc/Login";
$RETSUsername = "**********************";
$RETSPassword = "**********************";
$RETS->Connect($RETSURL, $RETSUsername, $RETSPassword);
$RETS->AddHeader("RETS-Version", "RETS/1.7.2");
$RETS->AddHeader('Accept', '*/*');
$RETS->SetParam('compression_enabled', true);
$RETS_PhotoSize = "LargePhoto";
$RETS_LimitPerQuery = 100;

if ($debugMode /* DEBUG OUTPUT */) {
    $RETS->SetParam("catch_last_response", true);
    $RETS->SetParam("debug_file", "CREA_Anthony.txt");
    $RETS->SetParam("debug_mode", true);
}

$DBML = "(LastUpdated=" . date('Y-m-d', strtotime($TimeBackPull)) . ")";
$params = array("Limit" => 1, "Format" => "STANDARD-XML", "Count" => 1);
$results = $RETS->SearchQuery("Property", "Property", $DBML, $params);
$totalAvailable = $results["Count"];

for ($i = 0; $i < ceil($totalAvailable / $RETS_LimitPerQuery); $i++) {
    $startOffset = $i * $RETS_LimitPerQuery;
    $params = array("Limit" => $RETS_LimitPerQuery, "Format" => "STANDARD-XML", "Count" => 1, "Offset" => $startOffset);
    $results = $RETS->SearchQuery("Property", "Property", $DBML, $params);
    foreach ($results["Properties"] as $listing) {
        // Do Some Stuff
    }
}
Here is what my current request looks like in the returned XML file:
http://data.crea.ca/Search.svc/Search?SearchType=Property&Class=Property&Query=%28LastUpdated%3D2015-09-22%29&QueryType=DMQL2&Count=1&Format=STANDARD-XML&Limit=1&StandardNames=0
Another thing: the CREA people said there should be around 1900 active listings to pull, but when I count the results I only get around 182 right now.
Did you get this resolved?
CREA DDF is a weird animal. They only partially support the RETS spec, not fully. They've designed their system around "Destinations" (aka "Data Feeds") and "Tech Provider" separations.
1) Each of your clients creates one or more data feeds, each of which is assigned a unique DestinationID. When setting up the feed, they select you as the Tech Provider, so their listings get included in your feed, too.
2) You, as the tech provider, have a single feed where you can pull all listings across all of your clients. Where this breaks down, though, is that each listing does not reference which feed/destination it belongs to. You need to pull data in the context of a particular destination, and then manually associate the current DestinationID with the listings that come through.
CREA thinks that they've made it simpler by having a single Tech Provider feed, but they've actually made it more difficult because they are providing incomplete data on the responses. You, as the developer, need to manually do the associations at your end.
Are your clients selecting you as their Technology Provider during the setup of their Data Feeds? Do you see their feeds show up in your Destinations table?
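A very rough sketch of the manual bookkeeping described above; everything here is hypothetical (the DestinationID values, pullListingsForDestination() and saveListing() are placeholders for however you run the query per feed and persist the results):

// Hypothetical: your own record of which DestinationIDs belong to your clients.
$destinationIds = array(12345, 67890);

foreach ($destinationIds as $destinationId) {
    // Placeholder: run the same SearchQuery in the context of this destination.
    $listings = pullListingsForDestination($RETS, $destinationId, $DBML);

    foreach ($listings as $listing) {
        // The listing payload does not say which destination it came from,
        // so attach the DestinationID yourself before saving.
        saveListing($listing, $destinationId); // placeholder persistence helper
    }
}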
I made a Diary at http://kees.een-site-bouwen.nl/agenda which can have Events for specific dates. You can add an event using the form located here: http://kees.een-site-bouwen.nl/evenementform
As you can see, I have three fields, 'Datum', 'Van' & 'Tot' (translation: Datum = date, Van = from, Tot = till).
When the time on that specific date has passed, I would like to run a script that deletes that specific row from the database.
I searched on Google and found a few things like MySQL triggers and PHP cron jobs, but I don't know if there's an easier way to do this, or how to do it exactly.
My database structure looks like this:
agenda // diary
- - - - // not added the whole structure.
idagenda
titel
waar
www
email
activated
....
....
agendadatum // diary dates
- - - - - -
id
datum
van
tot
idagenda
As you can see, I'm using a join to add more dates to one event.
How could I trigger deletion of the rows from the DB once their date/time has expired?
NOTE: I am using the CodeIgniter framework.
You could set a hook and use a function like:
$this->db->where('tot <', date('Y-m-d H:i:s'));
$this->db->delete('agendadatum');
My CodeIgniter is a bit rusty, but that query would remove all "old" entries on EVERY page load, so if you're going for high traffic this will not hold up.
Running a cron job every hour/day is probably the "cleanest" way. It does require you to add a where condition to every selection from agendadatum that forces the "tot" date to be in the future; otherwise it's possible you'll see expired entries between runs.
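A sketch of that approach, assuming CodeIgniter 2/3 with CLI support (the controller and method names are made up):

// application/controllers/Cron.php -- run from crontab, e.g.
//   0 * * * * php /path/to/index.php cron purge_agenda
class Cron extends CI_Controller {

    public function purge_agenda()
    {
        // Only allow this to run from the command line.
        if (!$this->input->is_cli_request()) {
            show_error('CLI only');
        }

        $this->load->database();
        $this->db->where('tot <', date('Y-m-d H:i:s'));
        $this->db->delete('agendadatum');
    }
}

// And on the read side, hide entries that have already expired:
// $this->db->where('tot >=', date('Y-m-d H:i:s'));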
Edit:
From what I can gather, if you define your hook like this:
$hook['post_controller_constructor'] = array(
'class' => '',
'function' => 'delete_old_entries',
'filename' => 'agenda.php',
'filepath' => 'hooks',
'params' => array()
);
And create a file named agenda.php in application/hooks which looks like:
<?php
function delete_old_entries() {
$CI =& get_instance();
$CI->load->database();
$query = $CI->db->query(" DELETE FROM agendadatum WHERE tot < NOW(); ");
}
It should work. However, this is pieced together from what I could find and is untested, so it might work and it might not; but something along these lines should do the trick, even if it isn't exactly this.
If I understand correctly:
CREATE TRIGGER deleteRows AFTER INSERT ON myTable
FOR EACH ROW BEGIN
  DELETE FROM myTable WHERE Datum <= NOW();
END;
MySQL has a built-in Event Scheduler that basically allows you to run arbitrary SQL code at scheduled times. You could purge your database once a day for instance.
However, I know from your other question that you are hosting your project on a shared host, and, unfortunately, such hosts often disable the feature. Check out this manual page to find out whether the feature is active on your server.
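If it is available, a minimal sketch of a daily purge event, run once (here via a CodeIgniter query; the event name is made up):

// Check whether the scheduler is on first: SHOW VARIABLES LIKE 'event_scheduler';
// Creating the event requires the EVENT privilege on the schema.
$this->load->database();
$this->db->query("
    CREATE EVENT IF NOT EXISTS purge_expired_agendadatum
    ON SCHEDULE EVERY 1 DAY
    DO DELETE FROM agendadatum WHERE tot < NOW()
");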
Anyone know if/where there is documentation for valid ObjectList filter arrays?
The project's entry on GitHub has a tiny blurb on it directing me to the API documentation, but that also fails to have a comprehensive list, and a search on 'filters' talks about containers only, not the objects themselves.
I have a list of videos, each in four different formats named the same thing (sans filetype). Using the php-opencloud API, I want to GET only one of those video formats (to grab the unique filename rather than all its different formats).
I figured using a filter is the way to go, but I can't find any solid documentation.
Someone has got to have done this before. Help a noob out?
Most of the links on this page are dead now. Here's a current link to the php-opencloud documentation, which includes an example of using a prefix to filter the objectList results:
http://docs.php-opencloud.com/en/latest/services/object-store/objects.html#list-objects-in-a-container
I didn't find documentation of this, but apparently when the Rackspace Cloud Files documentation mentions arguments in a query string, those translate to arguments in an objectList method call like this:
GET /v1/MossoCloudFS_0672d7fa-9f85-4a81-a3ab-adb66a880123/AppleType?limit=2&marker=grannysmith
equals
$container->objectList(array('limit'=>'2', 'marker'=>'grannysmith'));
As Glen pointed out, there isn't support (at the moment) for the service to apply filters on objects. The only thing you might be interested in is supplying a prefix, which allows you to refine the objects returned based on how the filenames start. So if you sent 'bobcatscuddling' as the prefix, you'd get back all associated video formats for that one recording.
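A minimal sketch of that, reusing the objectList() call shown earlier (the prefix value is just the example name):

$objects = $container->objectList(array('prefix' => 'bobcatscuddling'));
while ($object = $objects->next()) {
    echo $object->name, PHP_EOL; // every format of that one recording
}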
Your only option, it seems, is to get back all objects and iterate over the collection:
use OpenCloud\Rackspace;

$connection = new Rackspace(RACKSPACE_US, array(
    'username' => 'foo',
    'apiKey'   => 'bar'
));

$service = $connection->objectStore('cloudFiles', 'DFW', 'publicURL');
$container = $service->container('CONTAINER_NAME');

$processedObjects = array();
$marker = '';

while ($marker !== null) {
    $objects = $container->objectList(array('marker' => $marker));
    $total = $objects->count();
    $count = 0;

    if ($total == 0) {
        // Nothing left to page through
        break;
    }

    while ($object = $objects->next()) {
        // Extract the filename
        $filename = pathinfo($object->name, PATHINFO_FILENAME);

        // Make sure you only deal with the filename once (i.e. ignore the other extensions)
        if (!in_array($filename, $processedObjects)) {
            // You can do your DB check here...

            // Stock the array
            $processedObjects[] = $filename;
        }

        $count++;
        $marker = ($count == $total) ? $object->name : null;
    }
}
What you'll notice is that you're advancing the marker and making a new request for each batch of up to 10,000 objects. I haven't tested this, but it'll probably lead you in the right direction.
Unfortunately, the underlying API doesn't support filtering for objects in Swift/Cloud Files containers (cf. http://docs.rackspace.com/files/api/v1/cf-devguide/content/List_Objects-d1e1284.html). The $filter parameter is supported as part of the shared code, but it doesn't actually do anything with Cloud Files here.
I'll see if I can get the docs updated to reflect that.