Set cache time in symfony - php

I am using the file cache in symfony to store my data for a limited time. Below is the code I have written:
$c = new sfFileCache(array('cache_dir' => sfConfig::get('sf_cache_dir').'/function'));
if ($c->has('myarray')) {
    $cached = $c->get('myarray');
    if (!empty($cached)) {
        $data = unserialize($cached);
    }
} else {
    foreach ($queries as $key => $query) {
        foreach ($query->fetchArray() as $result) {
            $data[] = $result;
        }
    }
    $c->set('myarray', serialize($data));
}
Can anybody tell me how to set a time limit for the file cache in symfony so that the cache is automatically destroyed after an hour?

Simply:
$c = new sfFileCache(array(
    'cache_dir' => sfConfig::get('sf_cache_dir').'/function',
    'lifetime'  => 3600
));
See the sfCache source code to learn about the other options.
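For reference, a minimal sketch of the question's pattern with the lifetime applied (assuming symfony 1.x, where sfCache::set() also accepts an optional per-entry lifetime in seconds as its third argument; buildData() is a hypothetical stand-in for the query loop):
$c = new sfFileCache(array(
    'cache_dir' => sfConfig::get('sf_cache_dir').'/function',
    'lifetime'  => 3600, // default lifetime: one hour
));
if ($c->has('myarray')) {
    $data = unserialize($c->get('myarray'));
} else {
    $data = buildData(); // hypothetical helper: run the queries and build the array
    $c->set('myarray', serialize($data), 3600); // optional explicit per-entry lifetime
}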

Just sharing my code for those using APC; it should be much the same. I passed the prefix "query" since I was caching a query.
$cache = new sfAPCCache(array('lifetime' => 600, 'prefix' => 'query'));
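Usage is then the same as with sfFileCache; a small sketch (the 'my_result' key and runQuery() are made up for illustration):
if ($cache->has('my_result')) {
    $rows = unserialize($cache->get('my_result'));
} else {
    $rows = runQuery(); // hypothetical: compute the result to cache
    $cache->set('my_result', serialize($rows)); // stored under the 'query' prefix, expires after 600 seconds
}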

Related

update_post_meta and update_field not working without any error received

I am working with a WordPress site that imports all data from an API into the site automatically via a cron job. However, I'm now at the part of saving the field data from the API. The problem is that update_post_meta and update_field are both not working.
I have already tried switching between the two methods of saving, but neither works. There are no error prompts and no results either (which is pretty weird to me). I checked the built-in plugin of the site and it uses update_post_meta.
add_action('wp_ajax_nopriv_get_properties_from_api', 'get_properties_from_api');
add_action('wp_ajax_get_properties_from_api', 'get_properties_from_api');
function get_properties_from_api(){
$file = get_stylesheet_directory() . '/report.txt';
$current_page=(! empty($_POST['current_page'])) ? $_POST['current_page'] : 1;
$properties = [];
$results = wp_remote_retrieve_body(wp_remote_get('https://www.realestateview.com.au/listing_api?rm=search&company=castlemain&code=29GKRRSgkVdQVM&CID=5813&json=1&ptr=r&con=S&portalview=residential&rn=1&pg='. $current_page));
file_put_contents($file, "Current page: ". $current_page. "\n\n", FILE_APPEND);
$results=json_decode($results, true);
if(!is_array($results['Listings']) || empty($results['Listings'])){
return false;
}
$properties[]=$results;
foreach($properties[0] as $property){
$property_slug = sanitize_title($property->TitleNoHTML, '-', $property->OrderID);
$inserted_property = wp_insert_post([
'post_name' => $property_slug,
'post_title' => $property->TitleNoHTML,
'post_type' => 'property',
'post_status' => 'publish',
]);
if(is_wp_error($inserted_property)){
continue;
}
$fillable=[
//Basic information
get_the_title($inserted_property) => 'TitleNoHTML',
'REAL_HOMES_property_price' => 'PriceText',
'REAL_HOMES_property_size' => 'LandSizeText',
'REAL_HOMES_property_bedrooms' => 'BedroomsCount',
'REAL_HOMES_property_bathrooms' => 'BathroomsCount',
'REAL_HOMES_property_garage' => 'LockUpGaragesCount',
'REAL_HOMES_featured' => 'FeaturedProperty',
//$this->REAL_HOMES_property_id =
//$this->REAL_HOMES_property_year_built =
//Location on Map
'REAL_HOMES_property_address' => 'AddressText',
'REAL_HOMES_property_location' => 'Suburb',
'REAL_HOMES_property_map' => 'DisplayTrueAddress',
//Gallery
'REAL_HOMES_property_images' => 'PhotoOriginalURL',
//Floor Plans
//$this->inspiry_floor_plan_name =
'inspiry_floor_plan_price' => 'PriceText',
//$this->inspiry_floor_plan_price_postfix =
// $this->inspiry_floor_plan_size =
// $this->inspiry_floor_plan_size_postfix =
'inspiry_floor_plan_bedrooms' => 'BedroomsCount',
'inspiry_floor_plan_bathrooms' =>'BathroomsCount',
// $this->inspiry_floor_plan_descr =
'inspiry_floor_plan_image' => 'FloorplanThumbURL',
//Property Video
'inspiry_video_group_image' => 'PhotoThumbURL',
//$this->inspiry_video_group_title =
'inspiry_video_group_url' => 'VideoURL',
//DEPRECATED FIELDS
// $this->REAL_HOMES_360_virtual_tour =
// $this->REAL_HOMES_tour_video_url_divider =
// $this->REAL_HOMES_tour_video_url =
// $this->REAL_HOMES_tour_video_image =
//Agent
//$this->REAL_HOMES_agent_display_option =
'REAL_HOMES_agents' => 'ContactAgentName',
//Energy Performance
// $this->REAL_HOMES_energy_class =
// $this->REAL_HOMES_energy_performance =
// $this->REAL_HOMES_epc_current_rating =
// $this->REAL_HOMES_epc_potential_rating =
//Misc
// $this->REAL_HOMES_sticky =
// $this->inspiry_property_label =
// $this->inspiry_property_label_color =
// $this->REAL_HOMES_attachments =
'inspiry_property_owner_name' => 'ClientName',
//$this->inspiry_property_owner_contact =
'inspiry_property_owner_address' => 'ClientAddress',
// $this->REAL_HOMES_property_private_note =
// $this->inspiry_message_to_reviewer =
//Homepage slider
// $this->REAL_HOMES_add_in_slider =
// $this->REAL_HOMES_slider_image =
// $this->REAL_HOMES_page_banner_image =
//Additional fields
'inspiry_InspectionDateandStartTime' => 'ISOInspectionStart',
'inspiry_InspectionDateandFinishTime' => 'ISOInspectionFinish',
];
foreach($fillable as $key => $TitleNoHTML){
update_post_meta($inserted_property, $key, $property->$TitleNoHTML);
}
}
$current_page = $current_page + 1;
wp_remote_post(admin_url('admin-ajax.php?action=get_properties_from_api'), [
'blocking' => false,
'sslverify' => false,
'body' => [
'current_page' => $current_page
]
]);
}
What I'm expecting is that it should already be saving some data; if not, it should produce an error, but for some weird reason it doesn't. I tried to var_dump some variables and I think it should be working. Would anyone be able to help me find out where I went wrong?
In order to replace the WordPress cron with a real cron job, you will need to set up a cron job that fetches data from a web page using wget. First create a WordPress page that contains your PHP code, then fetch its content from cron using wget.
The real cron job command will look like this:
wget -q -O - http://yourdomain.com/your_cron_page >/dev/null 2>&1
-q tells wget to operate quietly (i.e. not to output the usual status information)
-O - tells it to write the page to standard output, which the >/dev/null redirection then discards
Keep in mind that everyone can access this page, so you might want to set some restrictions.
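One minimal way to add such a restriction (a sketch only; the token parameter and its value are assumptions, not part of the original code) is to require a secret token at the top of the cron page and pass it in the wget URL:
// Sketch only: reject requests that do not carry the assumed secret token,
// so only the cron job (which knows the token) can trigger the import.
// e.g. wget -q -O - "http://yourdomain.com/your_cron_page?token=LONG_RANDOM_STRING" >/dev/null 2>&1
if (!isset($_GET['token']) || $_GET['token'] !== 'LONG_RANDOM_STRING') {
    status_header(403); // WordPress helper that sends a 403 Forbidden status
    exit;
}
get_properties_from_api(); // then run the import as usual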

Array becoming multi-dimensional in AJAX request

So I am creating a function in WordPress which counts and sets a user session, storing its values in the user's local storage. I was able to make it work perfectly using cookies when the site is hosted locally, but for some reason it is not working after I uploaded it to the staging site. So I am trying to implement this function using another approach and decided to use local storage instead.
There's a problem with the array values that the function is generating, and I have spent almost the entire day trying to debug it. It is generating a multi-dimensional array instead of a single-dimensional one.
Here's my function code:
function monitor_post_views() {
    $page_id = 'page' . $_POST['page_id'];
    $timestamp = time();
    // 30 minutes timeout
    $timeout = 1800;
    // Serves as my guide for debugging, will not include in the final code
    $message = '';
    if ( ! empty($_POST['page_id_array']) ) {
        //Checks if values from local storage exist
        //Gets the stored Array coming from AJAX call
        $page_id_array[] = json_decode(stripslashes($_POST['page_id_array']), true);
        if ( in_array_r($page_id_array, $page_id) ) {
            //Check if current page is found in array
            $message = 'FOUND IN ARRAY CHECKING !!!!';
            $temp = [];
            $page_id_array_temp = array('id' => $page_id, 'expiration' => $timestamp, 'message' => $message);
            $temp = $page_id_array_temp;
            //Pushes the generated array inside the $page_id_array
            array_push($page_id_array, $temp);
            print_r(json_encode($page_id_array));
            foreach ( $page_id_array as $page ) {
                //If page is in array, check if the session is expired, if not, do nothing, if expired, update and then run the view function
            }
        } else {
            // ID Not found in Array, Insert a new entry
            $message = 'ID NOT FOUND IN ARRAY, CREATING ENTRY !!!';
            $temp = [];
            $page_id_array_temp = array('id' => $page_id, 'expiration' => $timestamp, 'message' => $message);
            $temp = $page_id_array_temp;
            //Pushes the generated array inside the $page_id_array
            array_push($page_id_array, $temp);
            print_r(json_encode($page_id_array));
            //Set post view function here base on $_POST['page_id']
        }
    } else {
        //Not found in local storage, need to create one
        $message = 'CREATING A NEW ENTRY !!!!';
        $temp = [];
        $page_id_array = array('id' => $page_id, 'expiration' => $timestamp, 'message' => $message);
        $temp = $page_id_array;
        print_r(json_encode($temp));
        //Set post view function here base on $_POST['page_id']
    }
    wp_die();
}
add_action('wp_ajax_monitor_post_views', 'monitor_post_views');
add_action('wp_ajax_nopriv_monitor_post_views', 'monitor_post_views');
Here's a screenshot of what this function generates
Here's a sample JSON
[[{"id":"page1202","expiration":1551125579,"message":"FOUND IN ARRAY CHECKING !!!!"},{"id":"page1206","expiration":1551125613,"message":"ID NOT FOUND IN ARRAY !!!! INSERTING ENTRY !!!"}],{"id":"page1296","expiration":1551125624,"message":"ID NOT FOUND IN ARRAY !!!! INSERTING ENTRY !!!"}]
I was trying to generate a one-dimensional array but ended up with this.
Any thoughts? Thanks in advance.
The problem is that you are creating arrays too many times.
Change $page_id_array and $page_id_array_temp to an object:
$page_id_array = new \stdClass(); // no need to declare it as an array
Replace
$page_id_array_temp = array('id' => $page_id, 'expiration' => $timestamp, 'message' => $message);
with
$page_id_array->id = $page_id;
$page_id_array->expiration = $timestamp;
$page_id_array->message = $message;
Also, there is no need to declare $temp as an array first; you can assign it directly:
$temp = $page_id_array;
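Putting the suggestion together, here is a minimal sketch of the first branch (a sketch only; it also avoids wrapping the decoded data in another array, which is what made the result multi-dimensional in the first place):
$page_id_array = json_decode(stripslashes($_POST['page_id_array']), true) ?: array();

$entry = new \stdClass();
$entry->id = $page_id;
$entry->expiration = $timestamp;
$entry->message = $message;

$page_id_array[] = $entry; // append, keeping the list one level deep
echo json_encode($page_id_array);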

How to upload large files (around 10GB)

I want to transfer an archive of around 10GB to my Amazon S3 bucket, using a PHP script (it's a backup script).
I currently use the following code:
$uploader = new \Aws\S3\MultipartCopy($s3Client, $tmpFilesBackupDirectory, [
    'Bucket' => 'MyBucketName',
    'Key' => 'backup'.date('Y-m-d').'.tar.gz',
    'StorageClass' => $storageClass,
    'Tagging' => 'expiration='.$tagging,
    'ServerSideEncryption' => 'AES256',
]);
try
{
    $result = $uploader->copy();
    echo "Upload complete: {$result['ObjectURL']}\n";
}
catch (Aws\Exception\MultipartUploadException $e)
{
    echo $e->getMessage() . "\n";
}
My issue is that after a few minutes (let's say 10), I receive an error from the Apache server: 504 Gateway Timeout.
I understand that this error is related to the configuration of my Apache server, but I don't want to increase the timeout of my server.
My idea is to use the PHP SDK's low-level API to do the following steps:
Use the Aws\S3\S3Client::uploadPart() method to manually upload 5 parts, and store the responses in $_SESSION (I need the ETag values to complete the upload);
Reload the page using header('Location: xxx');
Repeat the first two steps for the next 5 parts, until all parts are uploaded;
Finalise the upload using Aws\S3\S3Client::completeMultipartUpload().
I suppose that this should work, but before using this method I'd like to know if there is an easier way to achieve my goal, for example by using the high-level API...
Any suggestions?
NOTE: I'm not searching for an existing script: my main goal is to learn how to fix this issue :)
Best regards,
Lionel
Why not just use the AWS CLI to copy the file? You can create a script around the CLI, and that way everything is AWS native. (Amazon has a tutorial on that.) You can use the scp command:
scp -i Amazonkey.pem /local/path/backupfile.tar.gz ec2-user@Elastic-IP-of-ec2-2:/path/backupfile.tar.gz
From my perspective, it would be easier to do the work within AWS, which has features to move files and data. If you'd like to use a shell script, this article on automating EC2 backups has a good one, plus more detail on backup options.
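For reference, a direct copy to S3 with the AWS CLI itself would look something like this (assuming the CLI is installed and configured with credentials; the destination key is a placeholder):
aws s3 cp /local/path/backupfile.tar.gz s3://MyBucketName/backup.tar.gz
For large files the CLI switches to multipart uploads automatically, so a 10GB archive needs no special handling.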
To answer my own question (I hope it might help someone one day!), here is how I fixed my issue, step by step:
1/ When I load my page, I check whether the archive already exists. If not, I create my .tar.gz file and reload the page using header().
I noticed that this step was quite slow since there is a lot of data to archive. That's why I reload the page: to avoid any timeout during the next steps!
2/ If the backup file exists, I use AWS MultipartUpload to send 10 chunks of 100MB each. Every time a chunk is sent successfully, I update a session variable ($_SESSION['backup']['partNumber']) so I know which chunk needs to be uploaded next.
Once my 10 chunks are sent, I reload the page again to avoid any timeout.
3/ I repeat the second step until all parts have been uploaded, using my session variable to know which part needs to be sent next.
4/ Finally, I complete the multipart upload and I delete the archive stored locally.
You can of course send more than 10 x 100MB before reloading your page. I chose this value to be sure I won't hit a timeout even if the upload is slow, but I guess I could easily send around 5GB each time without issue.
Note: you cannot redirect your script to itself too many times. There is a limit (I think it's around 20 redirects for Chrome and Firefox before you get an error, and more for IE). In my case (the archive is around 10GB), transferring 1GB per reload is fine (the page is reloaded around 10 times). But if the archive size increases, I'll have to send more chunks each time.
Here is my full script. It could surely be improved, but it's working quite well for now and it may help someone with a similar issue!
public function backup()
{
ini_set('max_execution_time', '1800');
ini_set('memory_limit', '1024M');
require ROOT.'/Public/scripts/aws/aws-autoloader.php';
$s3Client = new \Aws\S3\S3Client([
'version' => 'latest',
'region' => 'eu-west-1',
'credentials' => [
'key' => '',
'secret' => '',
],
]);
$tmpDBBackupDirectory = ROOT.'Var/Backups/backup'.date('Y-m-d').'.sql.gz';
if(!file_exists($tmpDBBackupDirectory))
{
$this->cleanInterruptedMultipartUploads($s3Client);
$this->createSQLBackupFile();
$this->uploadSQLBackup($s3Client, $tmpDBBackupDirectory);
}
$tmpFilesBackupDirectory = ROOT.'Var/Backups/backup'.date('Y-m-d').'.tar.gz';
if(!isset($_SESSION['backup']['archiveReady']))
{
$this->createFTPBackupFile();
header('Location: '.CURRENT_URL);
}
$this->uploadFTPBackup($s3Client, $tmpFilesBackupDirectory);
unlink($tmpDBBackupDirectory);
unlink($tmpFilesBackupDirectory);
}
public function createSQLBackupFile()
{
// Backup DB
$tmpDBBackupDirectory = ROOT.'Var/Backups/backup'.date('Y-m-d').'.sql.gz';
if(!file_exists($tmpDBBackupDirectory))
{
$return_var = NULL;
$output = NULL;
$dbLogin = '';
$dbPassword = '';
$dbName = '';
$command = 'mysqldump -u '.$dbLogin.' -p'.$dbPassword.' '.$dbName.' --single-transaction --quick | gzip > '.$tmpDBBackupDirectory;
exec($command, $output, $return_var);
}
return $tmpDBBackupDirectory;
}
public function createFTPBackupFile()
{
// Compacting all files
$tmpFilesBackupDirectory = ROOT.'Var/Backups/backup'.date('Y-m-d').'.tar.gz';
$command = 'tar -cf '.$tmpFilesBackupDirectory.' '.ROOT;
exec($command);
$_SESSION['backup']['archiveReady'] = true;
return $tmpFilesBackupDirectory;
}
public function uploadSQLBackup($s3Client, $tmpDBBackupDirectory)
{
$result = $s3Client->putObject([
'Bucket' => '',
'Key' => 'backup'.date('Y-m-d').'.sql.gz',
'SourceFile' => $tmpDBBackupDirectory,
'StorageClass' => '',
'Tagging' => '',
'ServerSideEncryption' => 'AES256',
]);
}
public function uploadFTPBackup($s3Client, $tmpFilesBackupDirectory)
{
$storageClass = 'STANDARD_IA';
$bucket = '';
$key = 'backup'.date('Y-m-d').'.tar.gz';
$chunkSize = 100 * 1024 * 1024; // 100MB
$reloadFrequency = 10;
if(!isset($_SESSION['backup']['uploadId']))
{
$response = $s3Client->createMultipartUpload([
'Bucket' => $bucket,
'Key' => $key,
'StorageClass' => $storageClass,
'Tagging' => '',
'ServerSideEncryption' => 'AES256',
]);
$_SESSION['backup']['uploadId'] = $response['UploadId'];
$_SESSION['backup']['partNumber'] = 1;
}
$file = fopen($tmpFilesBackupDirectory, 'r');
$parts = array();
//Reading parts already uploaded
for($i = 1; $i < $_SESSION['backup']['partNumber']; $i++)
{
if(!feof($file))
{
fread($file, $chunkSize);
}
}
// Uploading next parts
while(!feof($file))
{
$body = fread($file, $chunkSize); // read the chunk once so a failed attempt retries the same data
unset($result);
do
{
try
{
$result = $s3Client->uploadPart(array(
'Bucket' => $bucket,
'Key' => $key,
'UploadId' => $_SESSION['backup']['uploadId'],
'PartNumber' => $_SESSION['backup']['partNumber'],
'Body' => $body,
));
}
catch (\Aws\Exception\AwsException $e)
{
// a try block requires a catch; swallow the error and let the do-while retry this part
}
}
while (!isset($result));
$_SESSION['backup']['parts'][] = array(
'PartNumber' => $_SESSION['backup']['partNumber'],
'ETag' => $result['ETag'],
);
$_SESSION['backup']['partNumber']++;
if($_SESSION['backup']['partNumber'] % $reloadFrequency == 1)
{
header('Location: '.CURRENT_URL);
die;
}
}
fclose($file);
$result = $s3Client->completeMultipartUpload(array(
'Bucket' => $bucket,
'Key' => $key,
'UploadId' => $_SESSION['backup']['uploadId'],
'MultipartUpload' => Array(
'Parts' => $_SESSION['backup']['parts'],
),
));
$url = $result['Location'];
}
public function cleanInterruptedMultipartUploads($s3Client)
{
$tResults = $s3Client->listMultipartUploads(array('Bucket' => ''));
$tResults = $tResults->toArray();
if(isset($tResults['Uploads']))
{
foreach($tResults['Uploads'] AS $result)
{
$s3Client->abortMultipartUpload(array(
'Bucket' => '',
'Key' => $result['Key'],
'UploadId' => $result['UploadId']));
}
}
if(isset($_SESSION['backup']))
{
unset($_SESSION['backup']);
}
}
If someone has questions, don't hesitate to contact me :)

Server keeps crashing on heavy load (facebook SDK + Laravel 5.2 + AWS)

I'm on the AWS free tier, using the Laravel framework and the Facebook v2.5 SDK (Web). I'm trying to get the latest 10 posts from Facebook for approximately 600 users, which would be 6,000 posts max. Every time I run the query it gets through about 10 loops, then the app completely crashes and goes offline, and comes back after a few minutes. Laravel isn't showing me any errors.
My code is:
/**
 * Get facebook users posts
 * @return \SammyK\LaravelFacebookSdk\LaravelFacebookSdk;
 */
public function posts(\SammyK\LaravelFacebookSdk\LaravelFacebookSdk $fb)
{
    // get posts
    $profiles_to_get = DB::table('facebook_profiles')->distinct('username')->get();
    $fb_admin_profile = DB::table('profiles')->where('social_media_type', "facebook")->first();
    $admin_fb_access_token = $fb_admin_profile->oauth_token;
    foreach ($profiles_to_get as $profile_to_get) {
        try {
            $response = $fb->get('/'.$profile_to_get->username.'?fields=posts.limit(10)', $admin_fb_access_token);
            $userNode = $response->getGraphUser();
            $posts = json_decode($userNode['posts']);
            foreach ($posts as $post)
            {
                isset($post->message) ? $fb_posts[] = array('account_id' => $profile_to_get->id,
                    'facebook_id' => $userNode->getID(),
                    'message_id' => $post->id,
                    'message' => $post->message,
                    'created_time' => $post->created_time->date,
                    'created_at' => Carbon::now(),
                    'updated_at' => Carbon::now(),
                ) : null;
                foreach ($fb_posts as $fb_post)
                {
                    $postDuplicateChecker = DB::table('facebook_posts')->where('message_id', $fb_post['message_id'])->get();
                    if ($postDuplicateChecker == !null)
                    {
                        DB::table('facebook_posts')->where('message_id', $fb_post['message_id'])->update($fb_post);
                        $notification = "First notification";
                    }
                    else
                    {
                        DB::table('facebook_posts')->insert($fb_post);
                        $notification = "Second notification";
                    }
                }
                if ($post > 0 && $post % 10 == 0)
                {
                    sleep(5);
                }
            }
        } catch (\Facebook\Exceptions\FacebookSDKException $e) {
            dd($e->getMessage());
        }
    }
    return Redirect::route('someroute', ['notification' => $notification]);
}
I've tried setting the query timeout to 300 so it doesn't time out, and also making the loop sleep after every 10 requests to reduce the load. I also have other apps running on the same server, but they never go offline when this app crashes.
My question is: is there any way to optimize the code so that I don't have to upgrade the server, or is upgrading the server my only choice?
The answer was to batch the query using array_chunk() and split the process into smaller pieces that could be handled more easily.
array array_chunk ( array $array , int $size [, bool $preserve_keys = false ] )
Reference: http://php.net/manual/en/function.array-chunk.php
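A rough sketch of that idea applied to the loop above (the batch size of 50 and the sleep are assumptions; in Laravel 5.2 the query builder's get() returns a plain array, so it can go straight into array_chunk()):
$profiles_to_get = DB::table('facebook_profiles')->distinct('username')->get();
foreach (array_chunk($profiles_to_get, 50) as $batch) {
    foreach ($batch as $profile_to_get) {
        // fetch and store the posts for this profile, as in the original loop
    }
    sleep(5); // pause between batches to keep memory and API load down
}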

connection between android app and cakephp

Hi, I'm making a web service in CakePHP for an Android app. I am getting the request, and the response is being sent, but the response is not visible on the client's end. My code is shown below. Is there some other method to send the response?
public function AndroidApp() {
    if (isset($_POST["myHttpData"])) {
        $coupon = trim($_POST["myHttpData"]);
        $couponId = $this->Code->find('all', array(
            'conditions' => array(
                'Code.coupon_code' => $coupon,
                'Code.status' => 'Used'
            ),
            'fields' => array('Code.id')));
        $studentAssessmentId = $this->StudentAssessment->find('all', array(
            'conditions' => array(
                'StudentAssessment.code_id' => $couponId[0]['Code']['id'],
                'StudentAssessment.status' => 'Complete'
            ),
            'fields' => array('StudentAssessment.id')));
        $scores = $this->AssessmentScore->find('all', array(
            'conditions' => array(
                'AssessmentScore.student_assessment_id' => $studentAssessmentId[0]['StudentAssessment']['id']
            ),
            'fields' => array('AssessmentScore.score')));
        $json = array();
        $assessment_data = array();
        //debug($scores);
        $i = 0;
        foreach ($scores as $score) {
            $assessment_data[$i] = array("score" => $score['AssessmentScore']['score']);
            $i += 1;
        }
        header('Content-type: application/json');
        $json['success'] = $assessment_data;
        $android = json_encode($json);
    } else {
        $json['error'] = "Sorry, no score is available for this coupon code!";
        $android = json_encode($json);
    }
    echo $android;
}
code smell, non-cakephp standards
First of all, as mentioned in the comments by others, you're not using the CakePHP request/response objects. Because of this, you're overly complicating things. See the documentation here:
http://book.cakephp.org/2.0/en/controllers/request-response.html
http://book.cakephp.org/2.0/en/controllers/request-response.html#dealing-with-content-types
And
http://book.cakephp.org/2.0/en/views/json-and-xml-views.html
The $scores loop that reformats the query results is probably redundant if you replace the find('all') with find('list'), using 'score' as the display field. See the documentation here: http://book.cakephp.org/2.0/en/models/retrieving-your-data.html#find-list
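A quick sketch of that find('list') replacement (CakePHP 2.x, reusing the field names from the question):
$scores = $this->AssessmentScore->find('list', array(
    'conditions' => array(
        'AssessmentScore.student_assessment_id' => $studentAssessmentId[0]['StudentAssessment']['id']
    ),
    'fields' => array('AssessmentScore.id', 'AssessmentScore.score')
));
// $scores is now array(id => score, ...); array_values($scores) gives a flat list of scores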
bugs
There also seem to be some bugs in your code:
the content-type header is only sent if $_POST["myHttpData"] is present.
you're only checking if $_POST["myHttpData"] is present, not if it actually contains any data (empty)
you're not checking if the various queries return a result. This will cause errors in your code if a query did not return anything! For example, you assume that $couponId[0]['Code']['id'] is present (but it won't be if the coupon-code was not found)
possible answer
Apart from these issues, the most probable cause for your problem is that you did not disable 'autoRender'. Therefore CakePHP will also render the view after you've output your JSON, causing a malformed JSON response.
public function AndroidApp() {
    $this->autoRender = false;
    // rest of your code here
}
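If you prefer the CakePHP 2.x response object (per the request/response documentation linked above) over calling header() and echo yourself, a minimal sketch would be:
public function AndroidApp() {
    $this->autoRender = false;
    // ... build $json as before ...
    $this->response->type('json');               // sets the Content-Type header
    $this->response->body(json_encode($json));   // sets the response body
    return $this->response;
}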
