I have an application that generates an array of statistics based on a greyhound's racing history. This array is then used to generate a table, which is output to the browser. I am currently working on a function that will generate an excel download based on these statistics. However, this excel download will only be available after the original processing has been completed. Let me explain.
The user clicks on a race name
The data for that race is then processed and displayed in a table.
Underneath the table is a link for an excel download.
However, this is where I get stuck. The excel download exists within another method within the same controller like so...
function view($race_id) {
    // Process race data and place in $stats
    // Output table & excel link
}

function view_excel($race_id) {
    // Process race data <- I don't want it to have to process all over again!
    // Output excel sheet
}
As you can see, the data has already been processed in the "view" method so it seems like a massive waste of resources having it processed again in the "view_excel" method.
Therefore, I need a method of transferring $stats over to the excel method when the link is clicked to prevent it having to be reproduced. The only methods I can think of are as follows.
Transferring $stats over to the excel method using a session flash
The variable may end up being too big for a session variable. Also, if for some reason the excel method is refreshed, the variable will be lost.
Transferring $stats over to the excel method using an ordinary session variable
As above, the variable may end up being too big for a session variable. This has the benefit that it won't be lost on a page refresh, but I'm not sure how I would go about destroying old session variables, especially if the user is processing a lot of races in a short period of time.
Storing $stats in a database and retrieving it in the excel method
This seems like the most viable method. However, it seems like a lot of effort to just transfer one variable across. Also, I would have to implement some sort of cron job to remove old database entries.
An example of $stats:
Array
(
[1] => Array
(
[fcalc7] =>
[avgcalc7] =>
[avgcalc3] => 86.15
[sumpos7] =>
[sumpos3] => 9
[sumfin7] =>
[sumfin3] => 8
[total_wins] => 0
[percent_wins] => 0
[total_processed] => 4
[total_races] => 5
)
[2] => Array
(
[fcalc7] => 28.58
[avgcalc7] => 16.41
[avgcalc3] => 28.70
[sumpos7] => 18
[sumpos3] => 5
[sumfin7] => 23
[sumfin3] => 7
[total_wins] => 0
[percent_wins] => 0
[total_processed] => 7
[total_races] => 46
)
[3] => Array
(
[fcalc7] => 28.47
[avgcalc7] => 16.42
[avgcalc3] => 28.78
[sumpos7] => 28
[sumpos3] => 11
[sumfin7] => 21
[sumfin3] => 10
[total_wins] => 0
[percent_wins] => 0
[total_processed] => 7
[total_races] => 63
)
)
Would be great to hear your ideas.
Dan
You could serialize the array into a file in sys_get_temp_dir() with a data-dependent file name. The only problem left is cleaning up old files.
Putting it into the database is also possible as you said, and deleting old data is easier than on the file system if you track the creation time.
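For what it's worth, a minimal sketch of the temp-file idea might look like the following. The helper names are made up for illustration: view() would call cache_stats() after building $stats, and view_excel() would try load_cached_stats() first and only reprocess on a cache miss.

// Data-dependent cache file name: one file per race.
function stats_cache_path($race_id) {
    return sys_get_temp_dir() . DIRECTORY_SEPARATOR . 'race_stats_' . md5($race_id) . '.ser';
}

// Called from view() once $stats has been built.
function cache_stats($race_id, array $stats) {
    file_put_contents(stats_cache_path($race_id), serialize($stats));
}

// Called from view_excel(); returns null on a cache miss so the caller
// knows it has to reprocess the race data.
function load_cached_stats($race_id) {
    $file = stats_cache_path($race_id);
    if (is_file($file)) {
        return unserialize(file_get_contents($file));
    }
    return null;
}

Cleaning up could then be as simple as an occasional pass over the temp directory that unlink()s any race_stats_* file older than, say, a day.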
Related
I am working with CakePHP 1.3.13. I have written code to insert a form into the database.
The deals database table looks like below.
When I insert a record into the database, the voucher_code column is not inserted.
When I print $this->data it gives all the data, like:
Array
(
[Deal] => Array
(
[title] => Deal title
[original_price] => 350
[discount] => 45
[total_price] => 192.5
[voucher_code] => TEST3211
[redeem_points] => 158
[deal_details] => tetert
[condition] => Testing
[deal_address] => tertre
[deal_end_date] => 2016-05-26
[no_of_deals] => 10
[merchant_id] => 24
[image] => 146399768856085.jpg
)
)
I write the insert query like this:
$this->Deal->create();
$this->Deal->save($this->data);
So all columns are inserted except voucher_code. What is causing this, and how can I resolve it?
Only those columns/fields will be saved that are present in the cached database table schema, so when adding fields after CakePHP has already cached it, you'll have to clear the cache (delete app/tmp/cache/models) in order for the new columns to be recognized.
Modifying app/config/core.php to set
Configure::write('debug', 2);
refreshing a page, and then restoring the debug setting to its original value will also work.
This is a very common issue. Go to the tmp folder of your project, delete all the cache files under the models and persistent folders, then re-run your query and it should start working.
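If you would rather script that cleanup than delete the files by hand, a one-off snippet along these lines would do it (a sketch; the path assumes a standard CakePHP 1.3 app/tmp layout, so adjust $cacheDir to your install):

// Delete CakePHP's cached model/persistent files so the new voucher_code
// column is picked up on the next request.
$cacheDir = dirname(__FILE__) . '/app/tmp/cache';
foreach (array('models', 'persistent') as $subDir) {
    $files = glob($cacheDir . '/' . $subDir . '/*');
    if ($files) {
        foreach ($files as $file) {
            if (is_file($file)) {
                unlink($file);
            }
        }
    }
}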
I'm integrating a PayPal payment method on my website. I got it all running just fine, but I'm stuck at the point where PayPal sends me to my return URL with information about the customer and the items purchased.
I get the following structure in the confirmation array:
Array
(
some customer info
...
[L_NAME0] => Frame%20Rojo
[L_NAME1] => External%20Hard%20Disk
[L_NUMBER0] => PD1002
[L_NUMBER1] => PD1003
[L_QTY0] => 1
[L_QTY1] => 1
[L_TAXAMT0] => 0%2e00
[L_TAXAMT1] => 0%2e00
[L_AMT0] => 29%2e00
[L_AMT1] => 100%2e00
...
)
What I'm interested in is saving the whole item list, quantities and prices to my database so I can later keep track of what's been sent and what hasn't.
My issue here is that, as you can see, PayPal returns a set of values named "something + n" (L_NUMBER0 and so on), so I can't just set up a fixed table in my database, as I don't know how many items a user will order. I could save it in two tables, purchase and items_per_purchase, but I still face the issue of parsing that array.
What would be the best way to run through it and see how many items per purchase there are to save?
I thought of some kind of loop which checks:
while(if(isset($_GET['L_NUMBER'.$cont]))) {
    // save to database
    L_NAME.$cont
    L_NUMBER.$cont
    ...
    cont++
}
... and increments some counter, but I would like to know if there's a better solution.
I think your solution is fine, though you don't need an if inside the while test...
$cont = 0;
while (isset($_GET['L_NUMBER' . $cont])) {
    // save to database after assembling array keys as follows...
    // L_NAME . $cont
    // L_NUMBER . $cont
    // etc.
    $cont++;
}
And you should never trust input from $_GET. I recommend using parameterized queries with PDO.
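A rough sketch of what that could look like with PDO; the table and column names here are made up, so adapt them to your purchase/items_per_purchase schema, and $pdo/$purchaseId are assumed to already exist:

// Prepare one parameterized INSERT and run it for every numbered item.
$stmt = $pdo->prepare(
    'INSERT INTO items_per_purchase (purchase_id, item_number, item_name, qty, amount)
     VALUES (:purchase_id, :item_number, :item_name, :qty, :amount)'
);

$cont = 0;
while (isset($_GET['L_NUMBER' . $cont])) {
    $stmt->execute(array(
        ':purchase_id' => $purchaseId,
        ':item_number' => $_GET['L_NUMBER' . $cont],
        // The values in the dump above are still URL-encoded (%20, %2e),
        // hence urldecode().
        ':item_name'   => urldecode($_GET['L_NAME' . $cont]),
        ':qty'         => (int) $_GET['L_QTY' . $cont],
        ':amount'      => urldecode($_GET['L_AMT' . $cont]),
    ));
    $cont++;
}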
Summary:
Google_Service_Calendar seems to be "force-paginating" results of $service->events->listEvents();
Background:
Google Calendar API v3,
using the PHP client library
We are developing a mechanism to sync our internal calendar with a user's google calendar.
Please note I will refer below to $x, which represents Google's default limit on the number of events, similar to $options['maxResults']. The default value is 250, but it should not matter: we have tested the below with and without explicitly defined request parameters such as 'maxResults', 'timeMin', and 'timeMax' - the problem occurs in all cases.
Another relevant test we did: export this calendar to foobar.ics, create a new Gmail user from scratch, and import foobar.ics into newuser@gmail.com. This DOES NOT replicate the issue. We have reviewed/reset various options in the subject calendar (sharing, etc.) but cannot find any setting that has any effect.
The Problem:
Normally, when we call this:
$calendar='primary';
$optParams=array();
$events = $this->service->events->listEvents($calendar, $optParams);
$events comes back as a Google_Service_Calendar_Events object containing $n "items". IF there are more than $x items, the results could be paginated, but the vanilla response (for a 'normal' result set with count($items) < $x) is a single object, and $events->nextPageToken should be empty.
One account we are working with (of course, the boss's personal account) does not behave this way. The result of:
$events = $this->service->events->listEvents('primary', []);
is a Google_Service_Calendar_Events object like this:
Google_Service_Calendar_Events Object
(
[accessRole] => owner
[defaultRemindersType:protected] => Google_Service_Calendar_EventReminder
[defaultRemindersDataType:protected] => array
[description] =>
[etag] => "-kakMffdIzB99fTAlD9HooLp8eo/WiDS9OZS7i25CVZYoK2ZLLwG7bM"
[itemsType:protected] => Google_Service_Calendar_Event
[itemsDataType:protected] => array
[kind] => calendar#events
[nextPageToken] => CigKGmw5dGh1Mms4aWliMDNhanRvcWFzY3Y1ZGkwGAEggICA6-L3lrgUGg0IABIAGLig_Zfi278C
[nextSyncToken] =>
[summary] => example#mydomain.com
[timeZone] => America/New_York
[updated] => 2014-07-23T15:38:50.195Z
[collection_key:protected] => items
[modelData:protected] => Array
(
[defaultReminders] => Array
(
[0] => Array
(
[method] => popup
[minutes] => 30
)
)
[items] => Array
(
)
)
[processed:protected] => Array
(
)
)
Notice that $events['items'] is empty, and nextPageToken is not null. If we then do a paginated request like this:
while (true) {
    $pageToken = $events->getNextPageToken();
    if ($pageToken) {
        $optParams = array('pageToken' => $pageToken);
        $events = $this->service->events->listEvents($calendar, $optParams);
        if (count($events) > 0) {
            h2("Google Service returned total of ".count($events)." events.");
        }
    } else {
        break;
    }
}
The next result set gives us the events. In other words, the google service seems to be paginating the initial results, despite the fact that we are certain the result is less than $x.
To be clear, if we have 5 events on our calendar, we expect 1 result with 5 items. Instead, we get 1 result with 0 items, but the first result of the 'nextPageToken' logic gives us the desired 5 items.
Solution Ideas?:
A. handle paginated results, and/or 'Incremental Synchronization'. These are on our list of features to implement, but we consider them more 'optimization' than 'necessity'. In other words, I understand that handling/sending nextSyncToken and nextPageToken is OPTIONAL - thus the issue we are having should not depend on our client code doing this.
B. use a different, non-primary calendar for this user. We think this particular primary calendar may be corrupt or somehow cached on Google's side. To be fair, we did at one point accidentally insert a bunch of junk events on this calendar, to the point that Google put us in read-only mode as described here: https://support.google.com/a/answer/2905486?hl=en - but we understand that was a temporary result of clunky testing. In other words, we know we HAD screwed this calendar up badly, but this morning we deleted ALL events, added a single test event, and got the same result as above FOR THIS CALENDAR. We cannot replicate this for any other user, including a brand new Gmail user.
C. delete the 'primary' calendar, create a new one. Unfortunately, we understand it is not possible to delete the primary CALENDAR, only to delete the CALENDAR EVENTS.
D. make the boss open a brand new google account
Any other suggestions? We are proceeding with A, but even that is a band-aid for the problem, and it does not answer WHY this is happening. How can we avoid it in the future? (Please don't say "D")
Thanks in advance for any advice or input!
There is a maximum page size; if you don't specify one yourself there is an implicit one (https://developers.google.com/google-apps/calendar/v3/pagination). Given this, it's necessary to implement pagination for your app to work correctly.
As you noticed, a page does not always contain the maximum number of results, so pagination is important even if the number of events does not exceed the page size. Just keep following the page tokens and it will eventually give you all the results (there will be a page with no nextPageToken).
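Something like the following loop (a sketch built on the $this->service and listEvents() calls already shown in the question) will collect every event regardless of how the server decides to split the pages. Note that, as in your dump, a page can come back with an empty items array but still carry a nextPageToken, so the loop has to key off the token rather than the item count.

// Accumulate events across all pages; stop when there is no nextPageToken.
$calendar  = 'primary';
$optParams = array();
$allEvents = array();

do {
    $events = $this->service->events->listEvents($calendar, $optParams);
    foreach ($events->getItems() as $event) {
        $allEvents[] = $event;
    }
    $pageToken = $events->getNextPageToken();
    // Keep any other query parameters you need alongside the pageToken.
    $optParams = array('pageToken' => $pageToken);
} while ($pageToken);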
TL;DR A :)
I'm doing some integrations with MS-based web applications, which forces me to fetch the data into my PHP application via SOAP, which is fine.
I get the structure of a file system as XML, which I convert to an object. Every document has an ID and its path. To be able to place the documents in a tree view, I've built some methods to work out each document's whereabouts in the file and folder structure. This worked fine until I started trying it with large file lists.
What I need is a faster method (or way to do things) than a foreach loop.
The method below is the troublemaker.
/**
 * Find parent id based on path
 * @param array $documents
 * @param string $parentPath
 * @return int
 */
private function getParentId($documents, $parentPath) {
    $parentId = 0;
    foreach ($documents as $document) {
        if ($parentPath == $document->ServerUrl) {
            $parentId = $document->ID;
            break;
        }
    }
    return $parentId;
}
// With 20 documents nested in different folders this method runs in 0.00033712387084961 seconds
// With 9000 documents nested in different folders it takes 60 seconds
The array passed to the method looks like this:
Array
(
[0] => testprojectDocumentLibraryObject Object
(
[ParentID] => 0
[Level] => 1
[ParentPath] => /Shared Documents
[ID] => 163
[GUID] => 505d70ea-51d7-4ef0-bf79-8e912553249e
[DocIcon] =>
[FileType] =>
[Title] => Folder1
[BaseName] => Folder1
[LinkFilename] => Folder1
[ContentType] => Folder
[FileSizeDisplay] =>
[_UIVersionString] => 1.0
[ServerUrl] => /Shared Documents/Folder1
[EncodedAbsUrl] => http://dev1.example.com/Shared%20Documents/Folder1
[Created] => 2011-10-08 20:57:47
[Modified] => 2011-10-08 20:57:47
[ModifiedBy] =>
[CreatedBy] =>
[_ModerationStatus] => 0
[WorkflowVersion] => 1
)
...
A bit bigger example of the data array is available here
http://www.trikks.com/files/testprojectDocumentLibraryObject.txt
Thanks for any help!
=== UPDATE ===
To illustrate how long the different steps take, I've added this breakdown.
Packet downloaded in 8.5031080245972 seconds
Packet decoded in 1.2838368415833 seconds
Packet unpacked in 0.051079988479614 seconds
List data organized in 3.8216209411621 seconds
Standard properties filled in 0.46236896514893 seconds
Custom properties filled in 40.856066942215 seconds
TOTAL: This page was created in 55.231353998184 seconds!
Now, it is the custom properties step I'm describing here; the other stuff is already somewhat optimized. The data sent from the WCF service is compressed and encoded at roughly a 10:1 ratio (around 10 MB uncompressed : 1 MB compressed).
The current priority is to optimize the custom properties part, where the getParentId method takes 99% of the execution time!
You may see faster results by using XMLReader or expat instead of SimpleXML. Both of these read the XML sequentially and won't store the entire document in memory.
Also make sure you have the APC extension enabled; for the actual loop it makes a big, big difference. Some benchmarks on the actual loop would be nice.
Lastly, if you cannot make it faster, then rather than trying to optimize reading the large XML document, you should look into ways where this 'slowness' is not an issue. Some ideas include an asynchronous process, proper caching, etc.
Edit
Are you actually calling getParentId for every document? This just occurred to me. If you have 1000 documents, then that already implies 1000 * 1000 loop iterations. If this is truly the case, you need to rewrite your code so it becomes a single loop.
How are you populating the array in the first place? Perhaps you could arrange the items in a hierarchy of nested arrays, where each key relates to one part of the path.
e.g.
['Shared Documents']
['Folder1']
['Yet another folder']
['folderA']
['folderB']
Then in your getParentId() method, extract the various parts of the path and just search that section of data:
private function getParentId($documents, $parentPath) {
    $keys = explode('/', $parentPath);
    $docs = $documents;
    foreach ($keys as $key) {
        if (isset($docs[$key])) {
            $docs = $docs[$key];
        } else {
            return 0;
        }
    }
    foreach ($docs as $document) {
        if ($parentPath == $document->ServerUrl) {
            return $document->ID;
        }
    }
    return 0;
}
I haven't fully checked that will do what you're after, but it might help set you on a helpful path.
Edit: I missed that you're not populating the array yourself initially; but doing some sort of indexing ahead of time might still save you time overall, especially if getParentId is called on the same data multiple times.
As usual, this was a matter of program design, and there are a few lessons to be learned from it.
In a file system the parent is always a folder. To speed up a process like this in PHP, put all the folders in a separate array, with their corresponding IDs as keys, and search that array when you want to find the parent of a file, instead of searching the entire file structure array!
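As a sketch of that kind of folder index (field names as in the dump above; this version keys the folders by ServerUrl, so the parent lookup becomes a single array access rather than a scan):

// Build the index once: folders only, keyed by their path.
$folderIdByPath = array();
foreach ($documents as $document) {
    if ($document->ContentType == 'Folder') {
        $folderIdByPath[$document->ServerUrl] = $document->ID;
    }
}

// The per-file lookup is now constant time instead of a loop
// over all 9000 documents.
function getParentIdFromIndex(array $folderIdByPath, $parentPath) {
    return isset($folderIdByPath[$parentPath]) ? $folderIdByPath[$parentPath] : 0;
}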
Packet downloaded in 6.9351849555969 seconds
Packet decoded in 1.2411289215088 seconds
Packet unpacked in 0.04874587059021 seconds
List data organized in 3.7993721961975 seconds
Standard properties filled in 0.4488160610199 seconds
Custom properties filled in 0.15889382362366 seconds
This page was created in 11.578738212585 seconds!
Compare the custom properties time with the one from my original post.
Cheers
I have an article that I want to publish on my Joomla! site. Every time I click Apply or Save, I get "Error 500 - An error has occurred! DB function reports no errors". I have no idea why this error comes up; all I can think is that it's a server error.
I'm using TinyMCE to type articles together with Joomla! 1.5.11.
Update: I turned on maximum error reporting in Joomla!, tried to save the article in the article manager, and got these couple of errors. Please check the screenshot.
I tried adding
<?php
ini_set('error_reporting', E_ALL);
error_reporting(E_ALL);
ini_set('log_errors',TRUE);
ini_set('html_errors',TRUE);
ini_set('display_errors',true);
?>
at the top of the index.php pages for Joomla! but it does not show any errors. I checked the error logs on the server and also no errors come up.
I managed to publish the article via phpMyAdmin, but then something else happens: when I try to access the article from the front end by clicking on the link to it, only a blank page comes up.
This is really weird, since the error log does not show any information, so I assume the error must be coming from Joomla! itself.
This is what I get if I add a print_r($_POST) before if (!$row->check()) { in /administrator/components/com_content/controller.php (around line 693):
Array
(
[title] => Test.
[state] => 0
[alias] => test
[frontpage] => 0
[sectionid] => 10
[catid] => 44
[details] => Array
(
[created_by] => 62
[created_by_alias] =>
[access] => 0
[created] => 2008-10-25 13:31:21
[publish_up] => 2008-10-25 13:31:21
[publish_down] => Never
)
[params] => Array
(
[show_title] =>
[link_titles] =>
[show_intro] =>
[show_section] =>
[link_section] =>
[show_category] =>
[link_category] =>
[show_vote] =>
[show_author] => 1
[show_create_date] => 0
[show_modify_date] => 0
[show_pdf_icon] =>
[show_print_icon] =>
[show_email_icon] =>
[language] =>
[keyref] =>
[readmore] =>
)
[meta] => Array
(
[description] => Test.
[keywords] => Test
[robots] =>
[author] => Test
)
[id] => 58
[cid] => Array
(
[0] => 58
)
[version] => 30
[mask] => 0
[option] => com_content
[task] => apply
[ac1e0853fb1b3f41730c0d52de89dab7] => 1
)
I had a bounty on this question, but the problem is still not resolved.
Any help will be appreciated!!
Here is a link to the article (text file with the source I got from TinyMCE) Article
I read this other question and saw that you can't post the article since it's confidential. Is it in plain English? Does it have HTML? Could you provide some more information? Joomla has some plugins that "filter" a lot of content. If you try to write "iframe" or "script" tags in Joomla's TinyMCE they are going to be filtered; this is Joomla's way of providing security.
Did you try to disable TinyMCE filters? Go to "Plugin Manager", "Editor - TinyMCE 2.0" and change "Code cleanup" options to test.
Looking at your POST array, it looks like the body text of your post isn't being sent. This would suggest it's a problem on the front end. Can you check the name of the HTML element where you are typing the body text? If you could edit and show us the relevant parts of the HTML form that would help too.
Edit: OK, that article you linked to is almost 150,000 bytes, so it might be that it's choking on that. If this is a one-off article that you probably won't have to edit too much, I'd recommend putting in some dummy text and then going into your database using phpMyAdmin or something and editing the text in the jos_content table. The introtext and fulltext columns are defined as MEDIUMTEXT, so they should be able to hold up to about 16 MB without hassle.
If writing and/or editing articles of this size is something you'll be doing often (and hence, don't want to go into the DB each time), then perhaps you'll have to look at the maximum post size allowed.
This error could occur when you use Firefox.
Try to reproduce using IE.
Regards
Simply do the following: ask your hosting provider to disable Suhosin in php.ini. When it is enabled, it is not possible to save large posts.
Shailedner Ahuja
My Web Developer
http://www.mywebdeveloper.in
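One quick way to check whether the post-size limits mentioned in the answers above are actually in play is to dump them with ini_get(). These are standard PHP and Suhosin directive names; the Suhosin ones simply return false if the extension is not loaded:

// Print the configured limits that commonly truncate large form posts.
$directives = array(
    'post_max_size',
    'suhosin.post.max_value_length',
    'suhosin.request.max_value_length',
);
foreach ($directives as $directive) {
    echo $directive . ' = ' . var_export(ini_get($directive), true) . "\n";
}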
Your article text is too big; the table column may not have enough space to store something this large. I would suggest using the LONGTEXT datatype. Check if that works for you.