How to make a user wait with Laravel - php

I'm using Laravel 5.3 and I've built an application which you can think of as a sort of e-commerce app. I essentially want to allow the user to leave a review for their order inside the application (not by email), but have them wait X days after their purchase. Like eBay, almost...
But what is the usual convention for creating a waiting period before a user can perform an action like this? I've been wondering whether I should set up cron jobs to send the user a request after X days, or perhaps log this in the database somehow, though I imagine that would be database-heavy with lots of users.
Any thoughts would be appreciated.
Update: The idea is to force a wait time upon the user. They can't leave a review until after the waiting period.

As I understand your use case, you need a persistent storage method to remember how many days have passed since the purchase of the item. There are, in general, 4 methods of persistent storage:
1. Cookies:
Storage: Client Side (but secure since Laravel encrypts and signs all cookies)
Specific: Unique to user machine and domain
Persistence Time: Can last forever unless the user specifically deletes the cookie
Common Use Cases: Saved preferences
2. Session:
Storage: Server Side (depends on driver)
Specific: Specific to user (but depends on driver)
Persistence Time: Usually expires in a couple of hours, but can be extended
Common Use Cases: Persistence layer from one request to another (like shopping cart, popup notifications after action triggers)
3. Caching:
Storage: Server Side
Specific: Generally application specific (but can be user specific)
Persistence Time: Generally an hour to a couple of days
Common Use Cases: Application specific storage use cases (e.g. total number of hits, most popular pages, database query caching, view caching)
4. Database:
Storage: Server Side
Specific: Can be user or app specific
Persistence Time: Forever until deleted
Common Use Cases: Longer term data persistence layer (e.g. user details)
Now, if you want to send an automated reminder to users a couple of days after the purchase, the persistence layer has to be server side. That rules out cookies, and caching would not be a good method either (since caching is best used for app-specific data, not user-specific data). But if you're looking for the review to be triggered by the user, I would suggest using cookies.
Needless to say, automated reminders need to be triggered by cron jobs.
In Summary:
If you want automated reminders: use database + cron jobs
If you don't want automated reminders: use cookies
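Either way, the gate itself is just a timestamp comparison. A minimal sketch of the database route, assuming an Eloquent Order model with the standard created_at column (the 7-day wait and the wording are illustrative):
<?php

use Carbon\Carbon;

// Gate sketch: the order's created_at timestamp (already in the database)
// is enough to enforce the wait; no extra bookkeeping is needed for the
// check itself.
$waitDays = 7;

if ($order->created_at->gt(Carbon::now()->subDays($waitDays))) {
    return redirect()->back()
        ->withErrors("You can review this order {$waitDays} days after purchase.");
}

// ...otherwise show the review form / accept the review.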

First, you should create a service that holds the commenting logic (for example, only allowing a user to comment on their order after N days)...
Then you can use a queue to handle this action, like so:
when an order is submitted, add a job to the queue delayed by N days.
In that job you can email the user to tell them they can now leave a comment on their order...
Adding the job to the queue can be done from an event, or from a listener/observer on the Order model...
For more information:
https://laravel.com/docs/5.4/providers
https://laravel.com/docs/5.4/queues
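A rough sketch of the delayed-job idea (SendReviewInvitation is a hypothetical job class implementing Illuminate\Contracts\Queue\ShouldQueue; the delay needs a queue driver that supports it, e.g. database or redis, not sync):
<?php

use Carbon\Carbon;

// Dispatched right after the order is submitted; the queue holds the job
// for N days and then emails the user that they can now leave a review.
dispatch(
    (new SendReviewInvitation($order))->delay(Carbon::now()->addDays(7))
);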

I'd suggest a different design. You cannot know, really, when they've received their order, when they've opened it, used it, or enjoyed it. Maybe it was delivered next day and they used it immediately. Maybe it was delivered 7 days later and not touched for a week. You really need to accept both ends of this spectrum as legitimate. But I get that you don't want people reviewing orders they've not legitimately used. So here is my alternate suggestion.
Allow the user to submit a review at any time after purchase. They might submit a review 5 days, 5 hours, or even 5 minutes after ordering. What you limit, instead, is when the review becomes visible on your site. So you set a policy that says reviews are accepted at any time, but only visible 72 hours after package delivery. (Or any number of hours; you might even scale the number. Trusted users can see their review in 24 hours, but new users require 72.)
This approach simplifies the code you have to write: all you need to track is the review submission datetime, and you limit the display of reviews based on it.
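A sketch of that display-side limit (Review is a hypothetical Eloquent model; the 72-hour hold is the example figure from above):
<?php

use Carbon\Carbon;

// Reviews are accepted at any time but only listed once the holding period
// has passed. Keyed here off the review's own created_at; to gate on
// delivery instead, compare against a delivered_at timestamp on the order
// (a column you would have to add).
$holdHours = 72; // illustrative; could vary by user trust level

$visibleReviews = Review::where('created_at', '<=', Carbon::now()->subHours($holdHours))
    ->latest()
    ->get();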

Related

Reading many API feeds and processing data in "one call" PHP and avoiding a timeout

Well, a question I've wanted to ask for a while, and it's becoming more and more the right time to ask it.
I am building a system in PHP that processes bookings for holiday facilities.
Every facility (hotel, motel, hostel, bed & breakfast, etc.) has its own login and is able to manage its own bookings.
It now runs on a system with a single database that separates the user data based on hostname, user, and login.
The system is also equipped with an option to import bookings provided by partners/resellers.
I have created a specific page for doing this.
Let's say :
cron.examplesystem.com
This URL is called by the Cron task every 5 minutes to check if there are any new bookings or any pending changes / cancellations or confirmations.
Now the issue
This has gone just fine until now.
However, we had one instance for every client, and generally every call had to process somewhere between 3 and 95 bookings.
But I have now updated my system from one instance per client to one instance for all clients.
So in the past :
myclient.myholidaybookings.example.com
I am now going to :
myholidaybookings.example.com
With one server to handle all clients instead of multiple servers for each client his own.
This will put a lot of stress on the server. That in general is not what worries me, since I have worked hard to make it manageable and scalable.
But I have no clue how to approach this.
Because, let's say we have about 100 clients, each with 3-95 bookings (average 49); that's around 4,900 bookings or updates to process per run.
Sooner or later we are sure to hit a timeout.
And this is what I want to prevent.
Now, all kinds of creative solutions can be found, but what's best practice? I want to create a solution that is solid and doesn't have to be reworked halfway to going live.
Summary :
problem : I have a system that processes many API feeds in one call, and I am sure we'll hit a timeout during processing once the system gets populated with users
desired solution : a best-practice approach in PHP for handling and processing many API feeds without worrying about a timeout as the user database grows.
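For reference, one common pattern for this is to turn the single cron call into a work queue with a time budget, so each run processes as much as it safely can and the next run picks up the rest. A rough, framework-free sketch (the feed_queue table and processClientFeed() are hypothetical):
<?php

// Each cron run claims and processes pending feeds until the time budget
// is spent; whatever is left waits for the next run.
$pdo = new PDO('mysql:host=localhost;dbname=bookings', 'user', 'pass');
$deadline = time() + 240; // stay well under an (illustrative) 300s limit

while (time() < $deadline) {
    $row = $pdo->query(
        "SELECT id, client_id FROM feed_queue
         WHERE processed_at IS NULL AND claimed_at IS NULL
         LIMIT 1"
    )->fetch(PDO::FETCH_ASSOC);

    if (! $row) {
        break; // nothing left this run
    }

    // Claim the row so an overlapping cron run skips it.
    $claim = $pdo->prepare(
        "UPDATE feed_queue SET claimed_at = NOW()
         WHERE id = ? AND claimed_at IS NULL"
    );
    $claim->execute([$row['id']]);
    if ($claim->rowCount() === 0) {
        continue; // another run got it first
    }

    processClientFeed($row['client_id']); // hypothetical import routine

    $pdo->prepare("UPDATE feed_queue SET processed_at = NOW() WHERE id = ?")
        ->execute([$row['id']]);
}
A proper job queue (Gearman, Beanstalkd, Laravel queues) gives you the claiming and retry semantics for free, which is usually the better long-term answer.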

Building a live database (approach)

I just want an approach for building a database with live records, so please don't just downvote. I don't expect any code.
At the moment I have a MySQL database with about 2 thousand users, and they are getting more. Each player/user has several points, which increase or decrease through certain actions.
My goal is for this database to be refreshed about every second, so users with more points move up and others move down... and so on.
My question is: what is the best approach for this "live database" where records have to be updated every second? In MySQL I can run time-based actions that execute a SQL command, but I don't think that's the greatest way. Can someone suggest a good way to handle this? E.g. other database providers like MongoDB, or anything else?
EDIT
This can't work client side, so I can't simply push/post it into the database via time-based events in the browser. For explanation: a user is training his character in the application. This training (to gain 1 level) takes 12 hours. After that time has elapsed, the record should be updated in the database AUTOMATICALLY, even if the user doesn't send a POST request himself (i.e. the user is not logged in); other users should still see the updated data in his profile.
You need to accept the fact that rankings will be stale to some extent. Your predicament is no different than any other gaming platform (or SO rankings for that matter). Business decisions were put in place and constantly get reviewed for the level of staleness. Take the leaderboards on tags here, for instance. Or the recent change that has profile pages updated a lot more frequently, versus around 4AM GMT.
Consider the use of MySQL Events. It is built-in functionality that replaces the need for cron tasks. I have 3 event-related links off my profile page if interested. You could calculate ranks on a timed schedule (your tolerance for staleness) and the users' requests for them would be fast (faster than the below from Gordon). On the con-side, they are stale.
Consider not saving (writing) rank info but rather focus just on filling in the slots of your other data. And get your rankings on the fly. As an example, see this rankings answer here from Gordon. It is dynamic, runs upon request with at least at that moment non-staleness, and would not require Events.
Know that only you should decide what is tolerable for the UX.
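For illustration, the rank-on-read variant can be as small as a correlated subquery; the same lazy idea covers the 12-hour training (store the start time, derive completed levels at read time). Table and column names here are illustrative, and an index on points keeps it fast at ~2k rows:
<?php

// No stored rank column and no per-second writes: each request derives
// ranks from the points that are already in the table.
$pdo = new PDO('mysql:host=localhost;dbname=game', 'user', 'pass');

$sql = "SELECT p.id, p.points,
               (SELECT COUNT(*) FROM players x WHERE x.points > p.points) + 1
                   AS player_rank
        FROM players p
        ORDER BY player_rank
        LIMIT 100";

foreach ($pdo->query($sql) as $row) {
    echo "#{$row['player_rank']} player {$row['id']}: {$row['points']} points\n";
}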

tracking calls to a PHP WebService for each user/IP

Basically, I have a PHP web service which will be available from:
website
mobile phone clients - (android/iphone)
Data is retrieved in JSON format. Requests are sent using the GET method over HTTPS. Data is kept in a Mongo database.
Here comes my main questions:
I want to get statistics for each user - how many calls he makes per minute/hour - how should I save this data? Will it be OK to save everything into the Mongo database? (1 day of statistics = 43 million rows.) Is there perhaps a solution which keeps data only for X days and then truncates everything automatically?
I also want statistics for each IP - how many calls were made from it per minute/hour/day - how should I save this data? Will it be OK to save everything into the Mongo database? Will it not become too large?
Is it always possible to see the IP of a user who is making a call to a web service? What about IPv6?
These 3 questions are the most interesting to me currently. I am planning on letting users use basic services without needing to log in! My main concern is database/filesystem performance.
Next comes a description of the measures I am planning to use, how I came to this solution, and why the 3 questions above are essential. Feel free to ignore the text below if you are not interested in the details :)
I want to protect my web service against crawling, i.e. somebody passing parameters (which are not hard to guess) to pull my entire database :)
Here is an usage example: https://mydomain/api.php?action=map&long=1.23&lat=2.45
As you can see, I am already using the secure HTTPS protocol to prevent casual capture of the entire GET request. It also protects against man-in-the-middle attacks. However, it doesn't stop attackers from opening the website and reading through the JS AJAX calls to learn the actual request structure. Or from decompiling the entire Android .APK file.
After reading a lot of questions throughout the internet, I came to the conclusion that there is no way to protect my data entirely, but I think I have found an approach that makes a crawler's life a lot harder!
And I need your advice on whether this whole thing is worth implementing, and what technologies should be used in my case (see below).
Next come the security measures against non-website (mobile device) use of the service by users who are not logged in.
<?php
/*
Case: "App first started". User: "not logged in"
0. generate a UNIQUE_APP_ID for the app when it is first started
1. send the UNIQUE_APP_ID to the server (or request a new UNIQUE_APP_ID from the server to write on the device)
1.1. verify how many UNIQUE_APP_IDs were created from this IP
    if more than X unique values in the last Y minutes ->
        ban the IP temporarily
        ban all UNIQUE_APP_IDs created by the IP during those Y minutes (use the delay to link them together),
        but exclude UNIQUE_APP_IDs from the ban if they are not showing strange behaviour (mainly: no calls to the API)
        force affected users to log in to continue using the service, or ask them to retry later when the ban wears off
    else register the UNIQUE_APP_ID on the server
Note: this is a 'hard' case, as the IP might belong to some public Wi-Fi AP. Precautions:
* temporary instead of permanent bans
* an activity check for each UNIQUE_APP_ID belonging to the IP. There could be legit users who have used the service for a long time and thus will not be affected by this check (created Z time ago from this IP).
* users will never be banned outright - rather forced to log in, where more restrictive actions can be applied individually!
Now that the application is registered and all validations have passed:
2. Case: "call to API is made". User: "not logged in"
2.1. get the IP from which the call is made
2.2. get the unique app ID of the client
2.3. verify the ID against the DB on the server
    if it does not exist -> reject the call
2.4. check how many calls this particular ID made in the last X minutes
    if more than X calls -> ban only this unique ID
2.5. check how many IDs were banned from this IP in the last X minutes; if more than Y, temporarily ban new calls for the whole IP
    check whether all banned IDs were created from the same IP; if yes, also ban that IP, if it differs from the new IP
*/
?>
As you can see, my whole solution is based on the idea that I can store data about each web service call and retrieve it for analysis easily... for every single call, or maybe every Xth call. I have no idea what kind of database should be used. I was thinking that Mongo might not be the best choice. Maybe MySQL? Keeping data safe from the wrong users is one reason. Another reason is that abusive usage will result in a huge load on the database (DDoS?). So I think it might be a good idea to count web service calls.
On the other side, a bit of calculation.
If there are 1000 users working simultaneously, each generating 30 calls to the web service per minute, that's 30,000 disk writes a minute. In an hour it's 60 times that, i.e. 1,800,000 disk writes. If I am planning to keep statistics about daily usage, then it's 24 times that again, i.e. on average there will be 43,200,000 tracking records kept on the server per day.
Each record contains information about: time + IP + user unique ID
I was also thinking about not storing any data at all and using Redis instead. I know that some kind of counter exists there. For each individual IP I can create a separate key and start counting calls. In that case everything is kept in the server's RAM. There is also an expiry parameter which can be set for each key. And for separate users I can store their IDs instead of the network IP. This solution only came to my mind after I finished writing this whole essay, so let me hear your ideas about the questions above.
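That Redis variant could look roughly like this (using the phpredis extension; key layout and limits are illustrative):
<?php

// One counter key per IP per minute; EXPIRE keeps memory bounded, so
// nothing ever needs to be truncated manually.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$ip  = $_SERVER['REMOTE_ADDR'];
$key = 'calls:' . $ip . ':' . date('YmdHi'); // one bucket per minute

$calls = $redis->incr($key);
if ($calls === 1) {
    $redis->expire($key, 3600); // keep each minute bucket for an hour
}

if ($calls > 60) { // illustrative per-minute limit
    http_response_code(429);
    exit(json_encode(['error' => 'rate limit exceeded']));
}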

Test Design: How do I simulate the passage of time to test the detection of and response to events scheduled in the future?

I am working on a web-based lease management application that needs to be able to generate various reports, reminders and alerts every day, based on information in the database at any given time. Some examples of the kind of reports, reminders and alerts include:
Send a transactional email letting a set of users know that their next invoice is due in 15 days.
Send a transactional email letting a set of users know that they have 1 or more past-due invoices
Alert a property manager that a particular property is X days past due and offer to print a set of eviction documents
etc.
It seems like the simplest approach is to define a collection of scripts that execute via a set of cron jobs every morning. Each script would check for the criteria needed to trigger a specific response from the system. For instance I may have a collection of scripts like, SendInvoiceDueIn15DaysEmail.php, SendInvoicePastDue30DaysEmail.php, etc.
My primary question is, given a database filled with test data, what is the best way to simulate the passage of time, say 90 days, to ensure that the data triggers the correct set of responses each day? Some of my daily tasks need to interact with third party APIs like Mandrill, Mail Chimp, some industry-specific accounting packages, etc.
My secondary question is, if anybody has experience developing applications that center around scheduled, recurring events that happen in the future, am I on the right track here? I've already built most of the core system (user management, property management, lease management), now it's time to test the automated side of things.
For what it's worth, the core of the application uses Laravel 4, but I don't think this is strictly a PHP question.
TL;DR How do I go about simulating the passage of time in order to check that, over a 90-day period, the system correctly detects a set of events and triggers an appropriate response that completes successfully?
I think the answer here is the same as in everything else you wish to test - a Clock mock. Abstract the way you check what time it is right now. Then you could create a mock clock implementation that would work much faster (or report whatever you'd like) during tests and another implementation that would simply return the true time in production.
This way you could also test other scenarios like time changes on the server or Daylight saving time.
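A minimal sketch of such a clock abstraction (interface and class names are illustrative, not from any particular library):
<?php

interface Clock
{
    public function now(); // returns a DateTimeImmutable
}

// Production: always the real time.
class SystemClock implements Clock
{
    public function now()
    {
        return new DateTimeImmutable();
    }
}

// Tests: time only moves when you tell it to.
class FrozenClock implements Clock
{
    private $now;

    public function __construct(DateTimeImmutable $now)
    {
        $this->now = $now;
    }

    public function now()
    {
        return $this->now;
    }

    public function advanceDays($days)
    {
        $this->now = $this->now->modify("+{$days} days");
    }
}

// In a test: step through 90 simulated days, running the daily tasks each day.
$clock = new FrozenClock(new DateTimeImmutable('2017-01-01'));
for ($day = 0; $day < 90; $day++) {
    runDailyTasks($clock); // hypothetical entry point for the cron scripts
    $clock->advanceDays(1);
}
Since the application is on Laravel 4, Carbon::setTestNow() achieves much the same effect for code that already calls Carbon::now().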

If user hasn't accepted payment, unset session after 5-10 min. PHP

I'm working with a small webshop (it includes a simple online store - no user accounts required).
I need to find out if a session user hasn't "accepted" the payment after... 5-10 min.
If not, I have to "roll back" the store and set the current shop sessions to null. Any help with how to do it? Examples would be great...
thx for now
One option is to have a last_activity field in your database and check/update it on each of the user's page loads.
Another option is to have a script run every 10 minutes or so via a cron job that cleans up stale sessions.
If possible, decouple payment transactions from the store's state; you shouldn't be rolling back the entire store on non-payment, just the transaction / order. Also, a time lock isn't the best way of doing things like this as real-life happenings, internet connection speeds, and general distractions all run the risk of creating situations where a buyer's payment is reversed by mistake. If they are impatient you may lose the sale...
However, without more info about the system in question it's hard to suggest an alternative, so I'd say use a cron job to remove any transactions older than 10 mins that haven't been set to ACCEPTED.
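A sketch of that cron script, assuming a transactions table with status and created_at columns (all names are illustrative):
<?php

// Run every 10 minutes via cron: void pending transactions that were never
// marked ACCEPTED within the window.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

$pdo->exec(
    "UPDATE transactions
     SET status = 'EXPIRED'
     WHERE status = 'PENDING'
       AND created_at < (NOW() - INTERVAL 10 MINUTE)"
);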
I'm not sure, but I think what I'm looking for is a cron job, something like this link
