I have a very old network box that has PHP4 and a variation of LDAP installed on it.
Querying information isn't normally an issue when the returned results are small, but I'm trying to get all entries in one specific CN.
This results in: 'Warning: ldap_search() [function.ldap-search]: Partial search results returned: Sizelimit exceeded'
All the suggestions I've read have been for PHP5+ and Active Directory, nothing for something this OLD.
I'm hoping you can help.
This is my very simple LDAP query; $r is the connection.
$result = ldap_search($r, "cn=location", "(cn=*)");
$entry = ldap_first_entry($r, $result);
while ($entry) { // guards against an empty result set
    $dn = ldap_get_dn($r, $entry);
    echo "DN is $dn\n";
    $entry = ldap_next_entry($r, $entry);
}
ldap_close($r);
This works until I hit the limit. I have tried changing the limit using LDAP_OPT_SIZELIMIT but obviously that didn't help.
What I'm wondering is: is there any way to count the entries and then process them in smaller, more manageable batches, using something like:
$sr = ldap_list($r, "cn=location", "(cn>=" . $last_location . ")");
Is that possible? Any other ideas?
Thanks
According to php.net/manual/...ldap-search, you should set the $sizelimit parameter to 0 to remove the client-side size limit. It will not, however, override the LDAP server's own limit, if that is the cause of the error.
The function call will look something like the following:
ldap_search($r, "cn=location", "(cn=*)", array(), 0, 0);
I'm not certain about the middle parameters ($attributes and $attrsonly), so you may need to adjust them if you get an error there.
See the above link for descriptions on all of the parameters for ldap_search().
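As for the batching idea in the question: one workaround is to partition the search by cn prefix, so each batch stays under the server's limit. This is only a sketch (untested on PHP4), and it assumes cn values begin with a letter or digit and that matching is case-insensitive; if a single prefix still exceeds the limit, split it further into two-character prefixes.
$prefixes = array_merge(range('a', 'z'), range('0', '9'));
foreach ($prefixes as $p) {
    $sr = @ldap_search($r, "cn=location", "(cn=" . $p . "*)");
    if ($sr === false) {
        continue; // search failed outright for this prefix
    }
    if (ldap_errno($r) == 4) {
        // 4 = LDAP_SIZELIMIT_EXCEEDED: this prefix alone is still too big,
        // so it would need to be split further (aa*, ab*, ...)
    }
    $entries = ldap_get_entries($r, $sr);
    for ($i = 0; $i < $entries["count"]; $i++) {
        echo "DN is " . $entries[$i]["dn"] . "\n";
    }
}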
Related
I've got a SugarCRM plugin which exports data to an external service. I'm using logic hooks for updated/deleted/new Contacts, but I've got a problem with synchronizing already existing data. I have to extract all the data from SugarCRM, and there are two SugarBean methods I've tried to use: get_full_list() and get_list(). The first gives me the full Contact list, but I need to send it in batches of at most 1,000 Contacts per JSON; the second method returns only the first page of Contacts (depending on config settings, 10 to 1,000 entries max).
I'm using this method ATM:
// prepare contacts data from SugarBean
$bean = BeanFactory::getBean($module);
$contactResults = $bean->get_full_list();
Then I foreach over $contactResults, convert the data I want to the required format, and send it as JSON via a POST request. I've tried to find a way to split it into batches, but I'm stuck :( Neither get_full_list() nor get_list() seems to work for me.
Any suggestions? Maybe someone solved this issue already?
Thanks in advance!
It sounds to me like your problem is creating the batches? If not, please be more specific about what isn't working.
For splitting an array into batches, you may want to have a look at https://php.net/manual/en/function.array-chunk.php
Also, get_list() supports retrieving later pages. It is defined like this: function get_list($order_by = "", $where = "", $row_offset = 0, $limit=-1, $max=-1, $show_deleted = 0, $singleSelect=false, $select_fields = array()).
That means for the second page you could specify $row_offset = 1000, for the third page make it 2000, and so on. So basically run a loop that calls get_list() with $limit = 1000 and increases an initial $row_offset of 0 by 1000 after each iteration, until the function returns fewer than 1000 records or none at all.
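For illustration, a sketch of that loop (untested; get_list() parameter semantics can vary between Sugar versions, and buildExportPayload() is a placeholder for your own formatting code):
$bean = BeanFactory::getBean($module);
$offset = 0;
$batchSize = 1000;
do {
    // get_list($order_by, $where, $row_offset, $limit)
    $page = $bean->get_list("", "", $offset, $batchSize);
    $contacts = empty($page['list']) ? array() : $page['list'];
    if (count($contacts) > 0) {
        // convert the beans to your required format and POST them as JSON
        $json = json_encode(buildExportPayload($contacts));
        // ... send $json via your POST request here ...
    }
    $offset += $batchSize;
} while (count($contacts) == $batchSize);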
Here are some general hints if you run into problems with processing those beans:
If the problem you're having is incomplete data, try loading each bean manually by using its ID. Some Sugar functions don't load all (special) fields by default.
If things seem to just fail for no reason, make sure to check your PHP log for errors. Loading that many beans at once could run into your PHP max_execution_time or memory_limit.
I've been searching for a suitable PHP caching method for MSSQL results.
Most of the examples I can find suggest storing the results in an array, which would then get included on the page. This seems great unless a request for the content is made at the same time as it is being updated/rebuilt.
I was hoping to find something similar to ASP's application-level variables, but as far as I'm aware, PHP doesn't offer this functionality?
The problem I'm facing is that I need to perform six queries per page to populate dropdown boxes, and this happens on the vast majority of pages. It's also not an option to combine the queries. The cached data will also need to be rebuilt sporadically, when the system changes; this could be once a day, once a week or once a month. Any advice will be greatly received, thanks!
You can use a Redis server and the phpredis PHP extension to cache results fetched from the database:
$redis = new Redis();
$redis->connect('/tmp/redis.sock'); // or $redis->connect('127.0.0.1', 6379);
$sql = "SELECT something FROM sometable WHERE condition";
$sql_hash = md5($sql);
$redis_key = "dbcache:${sql_hash}";
$ttl = 3600; // values expire in 1 hour
if ($result = $redis->get($redis_key)) {
    // cache hit: decode the stored JSON
    $result = json_decode($result, true);
} else {
    // cache miss: query the database and cache the encoded result
    $result = Db::fetchArray($sql); // Db::fetchArray() stands in for your own MSSQL fetch code
    $redis->setex($redis_key, $ttl, json_encode($result));
}
(Error checks skipped for clarity)
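Since the cached data only needs rebuilding sporadically, one approach (a sketch, using the same key scheme as above) is to have the code that changes the system delete the affected keys, so the next request repopulates them:
$redis->del($redis_key); // next read misses the cache and re-runs the query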
I am trying to run a query in BigQuery/PHP (using the Google PHP SDK) that returns a large dataset (can be 100,000 - 10,000,000 rows).
$bigqueryService = new Google_BigqueryService($client);
$query = new Google_QueryRequest();
$query->setQuery(...);
$jobs = $bigqueryService->jobs;
$response = $jobs->query($project_id, $query);
// query() is a synchronous call that returns the full dataset
The next step is to allow the user to download the result as a CSV file.
The code above will fail when the dataset becomes too large (memory limit).
What are my options to perform this operation with lower memory usage?
(One option is to save the results to another table with BigQuery and then do partial fetches with LIMIT and OFFSET, but I figured a better solution might be available...)
Thanks for the help
You can export your data directly from BigQuery:
https://developers.google.com/bigquery/exporting-data-from-bigquery
You can use PHP to run an API call that does the export (you don't need the bq tool).
You need to set the job's configuration.extract.destinationFormat; see the reference.
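For illustration, here is a rough sketch of such an extract job with the same old generated client the question uses. Treat every class and setter name here as an assumption: they follow the old Google_BigqueryService client and may differ in your client version. Bucket, dataset, and table names are placeholders.
$table = new Google_TableReference();
$table->setProjectId($project_id);
$table->setDatasetId('your_dataset'); // placeholder
$table->setTableId('your_table');     // placeholder

$extract = new Google_JobConfigurationExtract();
$extract->setSourceTable($table);
$extract->setDestinationUris(array('gs://your-bucket/export-*.csv')); // placeholder bucket
$extract->setDestinationFormat('CSV');

$config = new Google_JobConfiguration();
$config->setExtract($extract);

$job = new Google_Job();
$job->setConfiguration($config);

// Starts the export; poll $jobs->get() until the job reports DONE.
$insertedJob = $jobs->insert($project_id, $job);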
Just to elaborate on Pentium10's answer:
You can export up to a 1 GB file in JSON format.
Then you can read the file line by line, which will minimize the memory used by your application, and call json_decode() on each line.
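For example, a minimal sketch of reading a newline-delimited JSON export without loading it all into memory ($exportFile is a placeholder path):
$fh = fopen($exportFile, 'r');
while (($line = fgets($fh)) !== false) {
    $row = json_decode($line, true);
    if ($row === null) {
        continue; // skip blank or malformed lines
    }
    // write $row out as one CSV line, e.g. with fputcsv()
}
fclose($fh);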
The suggestion to export is a good one, I just wanted to mention there is another way.
The query API you are calling (jobs.query()) does not return the full dataset; it just returns a page of data, which is the first 2 MB of the results. You can set the maxResults flag (described here) to limit this to a certain number of rows.
If you get back fewer rows than are in the table, you will get a pageToken field in the response. You can then fetch the remainder with the jobs.getQueryResults() API by providing the job ID (also in the query response) and the page token. This will continue to return new rows and a new page token until you get to the end of your table.
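A sketch of that loop with the same client the question uses (getter and option names follow the old generated client and may differ between versions):
$query->setMaxResults(1000); // page size, in addition to the setQuery() call from the question
$response = $jobs->query($project_id, $query);
$jobId = $response->getJobReference()->getJobId();
do {
    foreach ((array) $response->getRows() as $row) {
        // stream each row straight into the CSV output instead of buffering it
    }
    $pageToken = $response->getPageToken();
    if ($pageToken) {
        $response = $jobs->getQueryResults($project_id, $jobId, array(
            'pageToken'  => $pageToken,
            'maxResults' => 1000,
        ));
    }
} while ($pageToken);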
The example here shows code (in Java and Python) to run a query and fetch the results page by page.
There is also an option in the API to convert directly to CSV by specifying alt='csv' in the URL query string, but I'm not sure how to do this in PHP.
I am not sure if you are still using PHP, but the answer is:
$options = [
    'maxResults' => 1000,
    'startIndex' => 0
];
$jobConfig = $bigQuery->query($query);
$queryResults = $bigQuery->runQuery($jobConfig, $options);
foreach ($queryResults as $row) {
    // Handle rows
}
I am currently working on a PHP script that will be polling Active Directory to pick out modified objects (people/users), via LDAP.
I'm able to filter on uSNChanged when I have the value, like so:
$previousUsn = '1234';
$ldapCon = ldap_connect('ldap-host');
$ldapBind = ldap_bind($ldapCon, 'ldap-user', 'ldap-password');
$sr = ldap_search($ldapCon, "ou=Users,dc=foo", "(uSNChanged>=$previousUsn)");
According to this, I should be able to retrieve a highestCommittedUSN attribute that could be used for the initial run of the script. I've been looking around to find out how this can be done using PHP & LDAP, but to no avail.
Alternatively, feel free to suggest completely different methods of retrieving changes in AD.
ldap_read(...) seems to do the trick:
function getHighestCommittedUsn($ldapCon) {
    // Assumes $ldapCon is already bound; the RootDSE is read with an empty base DN.
    $sr = ldap_read($ldapCon, "", "(highestcommittedusn=*)", array("highestcommittedusn"));
    $rs = ldap_get_entries($ldapCon, $sr);
    return $rs[0]["highestcommittedusn"][0];
}
Try setting the search base (the second argument) in your ldap_search call to "". That attribute lives on a pseudo-object called the RootDSE.
Greetings,
I already have a working connection to the AD and can search and retrieve information from it. I've even developed a recursive method by which one can retrieve all groups for a given user. However, I'd like to avoid the recursion if possible. One way to do this is to get the tokenGroups attribute from the AD for the user, which should be a list of the SIDs for the groups in which the specified user has membership, whether that membership is direct or indirect.
When I run a search for a user's AD information, though, the tokenGroups attribute isn't even in it. I tried specifically requesting that information (i.e., specifying it using the fourth parameter to ldap_search) but that didn't work, either.
Thanks,
David Kees
Solved my own problem and thought I'd put the answer here so that others might find it. The issue was using the ldap_search() function. The answer was to use the ldap_read() function instead of ldap_search(). The difference is the scope of the request. The search function uses a scope of "sub" (i.e., subtree) while the read function uses "base." The tokenGroups information can only be found when using a scope of "base" so using the correct PHP function was the key.
As I mentioned above, I was working from someone else's code in Perl to create my solution, and the Perl script used a function named "search" to do its LDAP requests, which led me down the wrong path.
Thanks to those who took a peek at the question!
--
As per the requests in the comments, here are the basics of the solution in code. I'm extracting from an object that I use, so this might not be 100%, but it'll be close. Also, variables not declared in this snippet (e.g. $server, $user, $password) are for you to figure out; I won't know your AD credentials anyway!
$ldap = ldap_connect($server);
ldap_bind($ldap, $user, $password);
$tokengroups = ldap_read($ldap, $dn, "(cn=*)", array("tokengroups"));
$tokengroups = ldap_get_entries($ldap, $tokengroups);
At this point, $tokengroups holds our results as an array. It should have a count index as well as some other information. To extract the actual groups, you'll need to do something like this:
$groups = array();
if ($tokengroups["count"] > 0) {
    $groups = $tokengroups[0]["tokengroups"];
    unset($groups["count"]);
    // If you want the SIDs for your groups, you can stop here.
    // If you want to resolve the SIDs to DNs, you can do something like this.
    // sid_decode() is from: http://www.php.net/manual/en/function.unpack.php#72591
    foreach ($groups as $i => $sid) {
        $sid = sid_decode($sid);
        $sid_dn = ldap_read($ldap, "<SID=$sid>", "(cn=*)", array("dn"));
        if ($sid_dn !== false) {
            $group = ldap_get_entries($ldap, $sid_dn);
            $groups[$i] = ($group["count"] == 1) ? $group[0]["dn"] : NULL;
        }
    }
}
That's the basics. There's one caveat: you'll probably need to work with the individual or individuals who manage AD accounts at your organization. The first time I tried to get this running (a few years ago, so my memory is somewhat fuzzy) the account that I was given did not have the appropriate authorization to access the token groups information. I'm sure there are other ways to do this, but because I was porting someone else's code for this specific solution, this was how I did it.