How does the cache in Laravel work?
I use Cache::remember to cache the response to a request for 5 seconds; when the time expires the cache should be refreshed, but for some reason that does not happen.
Cache driver: Redis.
Without caching, the function returns fresh results, but with caching it returns the old results.
Code example:
public function last_win()
{
    return Cache::remember("last-wins", config("cache-config.cache_time.last-wins"), function () {
        return GameBet::with("getGame")
            ->where('created_at', '<=', Carbon::now())
            ->where("win", ">", 0)
            ->orderBy("created_at", "desc")
            ->orderBy("id", "desc")
            ->simplePaginate(25)
            ->toArray();
    });
}
In config:
"cache_time" => [
    "last-wins" => env("CACHE_TIME_LAST_WINS", 5), // 5 sec
],
Please tell me what I'm doing wrong. :)
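For context, `Cache::remember` behaves roughly like the sketch below (this is a simplified illustration of the documented behavior, not the framework source): on a hit the closure never runs; on a miss the closure runs and its result is stored with the given TTL. One thing worth checking: in Laravel 5.7 and earlier the second argument was interpreted as minutes, not seconds, so a value of 5 would keep the stale result for 5 minutes.

```php
// Simplified sketch of Cache::remember's documented behavior.
function rememberSketch(string $key, int $ttl, Closure $callback)
{
    $value = Cache::get($key);
    if ($value !== null) {
        return $value; // cache hit: the closure never runs
    }

    $value = $callback();
    Cache::put($key, $value, $ttl); // TTL in seconds (Laravel 5.8+), minutes before that
    return $value;
}
```

With Redis you can also verify the TTL directly (`redis-cli TTL laravel_cache:last-wins`, key prefix depending on your config) to see whether the key is actually expiring.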
I have two mobile applications, Android and iOS. Both make requests to a Laravel backend.
I just added a little form inside the app and tried to save it to the DB. The problem is that whenever I try to save it on iOS I get this error:
PDOException: SQLSTATE[40001]: Serialization failure: 1213 Deadlock found when trying to get lock; try restarting transaction in /var/app/current/vendor/laravel/framework/src/Illuminate/Database/Connection.php:480
On Android, no matter how hard I try, the error is never thrown.
Specifically, the exception is thrown by an update query, and no jobs should be touching that table at any time.
This is the Controller function that throws the error (exact row highlighted):
public function questionnaireUpdate($trainingId, Request $request)
{
    $training = Auth::user()->club->trainings()->findOrFail($trainingId);
    $workloads = Workload::getAppWorkloads();
    $user = Auth::user();
    foreach ($workloads as $key => $w) {
        $workload = $training->workloads()
            ->where('workload_id', '=', $w->id)
            ->where('user_id', '=', Auth::user()->id)
            ->first();
        if ($workload) {
            DB::table('athlete_workload')
                ->where('training_set_id', '=', $training->id)
                ->where('workload_id', '=', $w->id)
                ->where('user_id', '=', Auth::user()->id)
                ->update([
                    // ===> This is the row that throws the exception <===
                    'value' => floatval($request->get(WorkloadHelper::getQuestionnaireSlug($w))),
                ]);
        } else {
            // Performs an insert
        }
    }
    WorkloadHelper::updateUserSumAndAverageWorkloadValues(Auth::user()->id, $trainingId);
    return new JsonResponse(['message' => 'Questionnaire updated'], 200);
}
I tried logging the requests, the values passed to the update calls, and even the query they execute via ->toSql(), but they are exactly the same. Any idea what could be causing this? And why only on iOS?
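This is not a diagnosis of the iOS-only behavior, but MySQL deadlocks (SQLSTATE 40001) are normally handled by retrying the transaction, exactly as the error message suggests. Laravel's DB::transaction accepts a second argument with the number of attempts and re-runs the closure when a deadlock is detected. A hedged sketch applied to the loop from the question (names copied from it):

```php
// Sketch: wrap the per-workload update/insert work in a transaction that
// Laravel will retry up to 3 times if the database reports a deadlock.
DB::transaction(function () use ($training, $workloads, $request) {
    foreach ($workloads as $w) {
        // ... the existing first() / update() / insert logic from the question ...
    }
}, 3); // second argument: number of attempts before giving up
```

Wrapping the whole loop in one transaction also means the row locks are taken and released together, which can itself reduce the window for deadlocks between two concurrent requests.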
I have a category table containing different categories of deals. Each category has many deals, each with an expiry date. I want to fetch only the unexpired deals together with their categories, but I am running into an issue: if any deal of a category falls within the time range, all of that category's deals are returned, whether they are expired or not. Here is my code:
$deals = DealCategory::where('name', '!=', 'Today Deal')
    ->whereRelation('deals', 'start_date', '<=', date('Y-m-d'))
    ->whereRelation('deals', 'expiry_date', '>=', date('Y-m-d'))
    ->with('deals', 'deals.deal_images', 'deals.deal_products', 'deals.deal_products.product', 'deals.rating')
    ->latest()
    ->paginate(12);
return response()->json(['Deals' => $deals, 'Date' => Carbon::now(), 'status' => 'success'], 200);
When you eager-load relations using with, you can pass additional criteria to tell Eloquent which related records to load:
DealCategory::where('name', '!=', 'Today Deal')
    ->whereRelation('deals', 'start_date', '<=', date('Y-m-d'))
    ->whereRelation('deals', 'expiry_date', '>=', date('Y-m-d'))
    ->with(['deals' => function ($query) {
        $query->where('start_date', '<=', date('Y-m-d'));
        $query->where('expiry_date', '>=', date('Y-m-d'));
        $query->with('deal_images', 'deal_products', 'deal_products.product', 'rating');
    }])
    ->latest()
    ->paginate(12);
Recent versions of Laravel even include a dedicated withWhereHas method to check for the existence of a relationship while simultaneously loading it with the same conditions:
DealCategory::where('name', '!=', 'Today Deal')
    ->withWhereHas('deals', function ($query) {
        $query->where('start_date', '<=', date('Y-m-d'));
        $query->where('expiry_date', '>=', date('Y-m-d'));
        $query->with('deal_images', 'deal_products', 'deal_products.product', 'rating');
    })
    ->latest()
    ->paginate(12);
Either option should do what you need.
I am working on a project, and I am not quite sure how to update a single record stored in the cache. For example:
PostController.php
class PostController extends Controller
{
    public function index()
    {
        $page = request('page', 1);
        $posts = Cache::rememberForever('posts_page_'.$page, function () {
            return Posts::query()
                ->with('category')
                ->latest()
                ->paginate(30);
        });
    }
    ...
    ...
}
Here is the store method:
public function store(PostCreateRequest $request)
{
    $post = Post::create([
        'title' => $request->title,
        'body' => $request->body,
        ...
        ...
        'category_id' => $request->category_id
    ]);
    ....
    ....
    Cache::flush();
    return redirect()->route('posts.index')
        ->with('success', 'post created');
}
I need to call Cache::flush() to clear the whole posts cache, because I have these keys in the cache: posts_page_1, posts_page_2, posts_page_3, etc., plus category_name1_page_1, category_name1_page_2, category_name2_page_1, etc.
But I am only adding one post (the same applies when I update a single post), and I still have to clear the entire cache. What would happen if I had thousands or millions of records? That Cache::flush() would take too much time. I would like to know how to add or update one specific record in the cache. I am currently using the file driver for caching. Thanks.
The problem here is that you are not storing individual posts, you are storing the result of a query, so any time you call the index() method you retrieve a cached response.
First, you can address this "issue" by avoiding rememberForever().
Why? Because you then have to clear the cache manually every time, and if you forget any items you can easily end up with many MB of forgotten data sitting in the cache.
Maybe change your method to:
$seconds = 300; // A default expiry time
Cache::remember('posts_page_'.$page, $seconds, function () {
    return Posts::query()
        ->with('category')
        ->latest()
        ->paginate(30);
});
The docs also describe several ways to clear the cache. Maybe use:
Cache::forget('posts_page_'.$page);
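One more option worth knowing about, although it is an assumption about your setup: cache tags are not supported by the file driver you are using, only by the redis and memcached drivers. If you can switch drivers, tags let you invalidate a related group of keys without flushing everything else:

```php
// Store each page of posts under a shared "posts" tag...
$posts = Cache::tags(['posts'])->remember('posts_page_'.$page, 300, function () {
    return Posts::query()->with('category')->latest()->paginate(30);
});

// ...then in store()/update(), invalidate only that group,
// leaving the rest of the cache untouched:
Cache::tags(['posts'])->flush();
```

With the file driver, the realistic choices are the ones above: short TTLs via Cache::remember, or forgetting individual known keys with Cache::forget.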
What is the error when I fetch data with paginate(10)? Vue.js doesn't render it, but when I use paginate(5) it works fine.
This is the controller code, with the relationships defined in the model files, and the response status is 200 OK:
$results = Posts::with(['comment'])
    ->orderBy('created_at', 'desc')
    ->paginate(5);
return response()
    ->json(['results' => $results]);
This code actually works for me, but I want 10 results per page, like this:
$results = Posts::with(['comment'])
    ->orderBy('created_at', 'desc')
    ->paginate(10);
return response()
    ->json(['results' => $results]);
With ->paginate(10), or any value > 5, I get no data and an error in the Vue.js console, even though the response status is 200 OK.
Try this:
$results = Posts::with('comment')->orderBy('created_at', 'desc')->paginate(10);
return response()->json(['results' => $results]);
I use Laravel Datatables server-side processing to get data from 4 tables and display them.
The code I have in routes/web.php is:
Route::get('/serverSideSymv', [
    'as' => 'serverSideSymv',
    'uses' => function () {
        $symv = App\Symvolaia::select('table1.*')
            ->join('table2', 'table1.field1', '=', 'table2.id')
            ->join('table3', 'table1.field2', '=', 'table3.id')
            ->leftJoin('table4', function ($join) {
                $join->on('table1.field3', '=', 'table4.field5');
                $join->on('table1.field6', '=', 'table4.field7');
            })
            ->select('table1.id', 'table2.field1', 'table1.field2', 'table1.field3', 'table1.field4', 'table3.field5', 'table1.field6', 'table1.field7', 'table1.field8', 'table4.field9', 'table1.field10', 'table1.field11', 'table1.field12', 'table1.field13', 'table1.field14', 'table2.field15');
        return Datatables::of($symv)
            ->editColumn('field5', function ($data) {
                return Illuminate\Support\Facades\Crypt::decrypt($data->field5);
            })
            ->make();
    }
]);
The issue is that the main table (table1) has more than 20,000 records and will grow much larger. I get a "max execution time exceeded" error. Is there any way to optimize the query, or otherwise show the results in Datatables?
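Not a definitive fix, but with three joins over a 20,000+ row table, the usual first step is to make sure every column used in a join condition is indexed; without indexes each join becomes a full table scan. A hedged sketch of a migration, using the placeholder table/column names from the question:

```php
// Sketch: add indexes on the columns table1 is joined on.
// The actual names are placeholders copied from the question.
Schema::table('table1', function (Blueprint $table) {
    $table->index('field1');              // joined to table2.id
    $table->index('field2');              // joined to table3.id
    $table->index(['field3', 'field6']);  // composite, for the two-column leftJoin to table4
});
```

Also note that the second ->select() call in the route replaces the first one ('table1.*'), so the initial select is redundant, and the Crypt::decrypt in editColumn runs per displayed row, which is cheap per page but rules out searching/sorting on that column in SQL.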