In our app, users can upload photographs. This upload event is registered as an activity and pushed into that user's followers' activity streams.
The flow and tech used are as follows:
MySQL backend for storage (the URL of the uploaded image is stored in a photos table along with some metadata).
An activity is generated and stored in an activities table in MySQL.
A Gearman job is created which sends the activity ID to Redis, where the activity ID is fanned out to all of the user's followers in order to populate their activity streams.
We retrieve a stream by getting the array of activity IDs from Redis and then performing a simple IN query in MySQL.
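For illustration, a minimal sketch of this fan-out and retrieval flow in PHP, assuming the phpredis extension, PDO, and hypothetical table and key names (followers, activities, stream:{userId}); the real schema and the Gearman worker wiring will differ:

    <?php
    // Fan-out: push the new activity ID onto each follower's stream list in Redis.
    // (In the setup described above this would run inside the Gearman worker.)
    function fanOutActivity(Redis $redis, PDO $db, int $activityId, int $authorId): void
    {
        $stmt = $db->prepare('SELECT follower_id FROM followers WHERE followed_id = ?');
        $stmt->execute([$authorId]);

        foreach ($stmt->fetchAll(PDO::FETCH_COLUMN) as $followerId) {
            $redis->lPush("stream:$followerId", $activityId);
            $redis->lTrim("stream:$followerId", 0, 999); // keep streams bounded
        }
    }

    // Retrieval: read the activity IDs from Redis, then hydrate them with one IN query.
    function getStream(Redis $redis, PDO $db, int $userId, int $limit = 20): array
    {
        $ids = $redis->lRange("stream:$userId", 0, $limit - 1);
        if (!$ids) {
            return [];
        }

        $placeholders = implode(',', array_fill(0, count($ids), '?'));
        $stmt = $db->prepare("SELECT * FROM activities WHERE id IN ($placeholders)");
        $stmt->execute($ids);

        return $stmt->fetchAll(PDO::FETCH_ASSOC);
    }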
In a user's activity stream, a small cropped version of the image is visible.
Now, this all works fabulously, and along with several caching layers it has helped keep the system very stable and scalable.
However, what is considered the best way to deal with a user who chooses to delete one or more of their photos? Currently, when they delete a photo, all of their followers get a broken image link in their activity streams.
Therefore we need to choose between two methods of dealing with this.
Store the activity ID in the photos table, and delete the activity relating to the photo when a user deletes that photo. This would work nicely, but users would occasionally see activities disappear from their streams.
Flag the image as deleted in the photos table and when generating the stream display a "this image has been deleted by the user" message.
Which would be considered the best approach to take, and why? Alternatively, are there any better ways of dealing with this that people would recommend?
Many thanks.
IMO it would be best to make this as transparent as possible. Tell the other users that the image was deleted and offer an option to drop the entry from the activity stream. That way the user stays in control and there is no "magic" happening in the background. Maybe for very old entries (whatever that means these days) you can remove the item automatically. But I would always prefer to be given that information.
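A minimal sketch of what the second option (flag, then show a message) might look like, assuming a hypothetical deleted_at column on the photos table that is joined into the stream query; table and column names are illustrative:

    <?php
    // Deleting a photo: flag the row instead of removing it.
    function softDeletePhoto(PDO $db, int $photoId, int $ownerId): void
    {
        $stmt = $db->prepare(
            'UPDATE photos SET deleted_at = NOW() WHERE id = ? AND user_id = ?'
        );
        $stmt->execute([$photoId, $ownerId]);
    }

    // Rendering a stream item: show a placeholder for deleted photos.
    function renderStreamItem(array $item): string
    {
        if ($item['deleted_at'] !== null) {
            return '<p>This image has been deleted by the user.</p>';
        }
        return '<img src="' . htmlspecialchars($item['thumb_url']) . '" alt="">';
    }

This keeps the activity row intact, so nothing silently disappears from followers' streams, which matches the transparency suggested above.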
Related
I would like to create an e-learning platform. Users will have a lot of things to choose from (mostly visible only to them), like:
add note
add movies to favorite
rate the instructor
And a few options that auto-save for each user, like:
unanswered questions
wrong answer questions
movies in progress (the user has watched only 2 minutes out of 5)
So what database or method should I use to store that kind of data?
I do not want to use cookies because the data needs to be saved on the user's account and not in the browser. The user needs to have all of it on every browser or mobile device.
I was wondering about JSON, but if I did that, each user's data would be viewable... so should I use MySQL?
I would recommend that you build your own data logger. What I mean by this is: build yourself a place to store every user's data, like an "eManager" if you will.
Once this has been built, you can assign the e-learning courses by ID to each user's profile in your "eManager", allowing you to keep track of each user's progress, etc.
The "eManager" could also save the user's notes, wrong answers, and unanswered questions, and you could create surveys with a slider rating to rate the user. Honestly, the possibilities are endless.
You can receive the data in two different ways:
(Personal) You can ask your users to email you requesting a username, and you generate a password and send it out to the user.
(Commercial) You build your eManager to receive the data from the website, which isn't too difficult to do.
It will be a long process. To answer your question from a different angle: practice SQL/PHP, as that would be your foundation; make sure you can run more advanced queries and can confidently edit your DB, etc.
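As a purely illustrative starting point (table and column names are assumptions, and this is only one of many possible layouts), per-user state like notes and movie progress could live in ordinary MySQL tables written to from PHP via PDO:

    <?php
    // Hypothetical tables for per-user e-learning state.
    $pdo->exec('
        CREATE TABLE IF NOT EXISTS user_notes (
            id INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
            user_id INT UNSIGNED NOT NULL,
            lesson_id INT UNSIGNED NOT NULL,
            body TEXT NOT NULL,
            created_at DATETIME NOT NULL
        )
    ');
    $pdo->exec('
        CREATE TABLE IF NOT EXISTS movie_progress (
            user_id INT UNSIGNED NOT NULL,
            movie_id INT UNSIGNED NOT NULL,
            seconds_watched INT UNSIGNED NOT NULL DEFAULT 0,
            PRIMARY KEY (user_id, movie_id) -- one progress row per user per movie
        )
    ');

    // Saving progress is a single upsert keyed to the account, not the browser,
    // so it follows the user across devices.
    $stmt = $pdo->prepare('
        INSERT INTO movie_progress (user_id, movie_id, seconds_watched)
        VALUES (?, ?, ?)
        ON DUPLICATE KEY UPDATE seconds_watched = VALUES(seconds_watched)
    ');
    $stmt->execute([$userId, $movieId, $secondsWatched]);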
Any more questions, just let me know. Thanks.
I am creating a web-based portal with web application features. The portal targets millions of users.
I want to store all login activity of all users, as well as all of their activity such as adding, modifying, and deleting records.
Instead of the database, I am planning to store all of this activity information in XML files.
If I store all of this information in the database, the database size will become very large.
If I am wrong, please give me some other suggestion.
I am using PHP and MySQL in my application.
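For what it is worth, a rough sketch of the XML-file logging described above (the file name and element names are assumptions; whether this scales better than a plain MySQL activity table is exactly the trade-off in question):

    <?php
    // Append one activity entry to an XML log file using SimpleXML.
    $file = 'activity-log.xml';

    $xml = file_exists($file)
        ? simplexml_load_file($file)
        : new SimpleXMLElement('<activities/>');

    $entry = $xml->addChild('activity');
    $entry->addChild('user_id', (string) $userId);
    $entry->addChild('action', 'login');      // e.g. login, add, modify, delete
    $entry->addChild('timestamp', date('c'));

    $xml->asXML($file);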
I have a REST API built with PHP & MySQL for storing ads.
My database contains an ads table and a users table.
I want to bump the view count for every ad once a user views it (only once; if they view it a second time, I don't want to increase the count again).
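For illustration only, one common way to enforce the "count each user once" rule is a composite primary key plus INSERT IGNORE; the ad_views table and column names here are assumptions, not part of the existing API:

    <?php
    // Assumed table:
    //   CREATE TABLE ad_views (
    //       ad_id   INT UNSIGNED NOT NULL,
    //       user_id INT UNSIGNED NOT NULL,
    //       PRIMARY KEY (ad_id, user_id)
    //   );

    $stmt = $pdo->prepare('INSERT IGNORE INTO ad_views (ad_id, user_id) VALUES (?, ?)');
    $stmt->execute([$adId, $userId]);

    // rowCount() is 1 only the first time this user views this ad,
    // so the counter is bumped at most once per (user, ad) pair.
    if ($stmt->rowCount() === 1) {
        $pdo->prepare('UPDATE ads SET view_count = view_count + 1 WHERE id = ?')
            ->execute([$adId]);
    }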
Also, how do I go about setting up a service for the client? It seems very aggressive to make a server request each time a user views an ad; it could add up to hundreds of requests at the same time.
Thanks! :)
I want to get all the photos (and their respective comments, likes, and tags) from all the albums of a Facebook page (which is public). While I know how to get this information using the Facebook API (or using FQL), my question is more about whether I should store this information or not.
I have two options as I see it: create a query for the Facebook API and display the results accordingly OR have a CRON job run a PHP script which updates my database every few minutes and pull information from that database.
Facebook does have the ability to do real-time updates of information; however, as of right now, they don't support photos or albums so this is clearly not an option.
To let you know the amount of information I'd be dealing with, the page I'd be working with contains roughly 40 albums (and counting) and most albums contain roughly the maximum of 200 images. That's a lot of information! If anybody has experience with caching results of API calls, then I'd greatly appreciate your input. Thanks!
I am working on something very similar, feeding albums and photos for pages and apps. I only ever need to store the ID of the album and the photo; everything else I can pull live from the Graph API using those two IDs.
A big page, let's say Walmart (https://shawnsspace.com/plugins/TimeLinegallery.php?pageid=walmart), has about 350 albums and roughly 20 photos in most of the albums.
I estimate their entire gallery to be over 3 GB, not including comments, like data, and resized images.
In my opinion it is a lot less taxing to fetch the information per user than it is to cache all the info, which can and will change daily, even hourly.
About cron jobs: you have an MAU-based limit on your app for API requests per day. Polling all the albums and photos for a gallery of your estimated size could put you above that limit, and Facebook would remove your access until that limit resets or until you get written permission to go over it.
If you exceed, or plan to exceed, any of the following thresholds
please contact us as you may be subject to additional terms: (>5M MAU)
or (>100M API calls per day) or (>50M impressions per day).
On a second note: all the objects you need carry a last-updated date. You could make the calls and only re-cache items whose date differs from the one you last stored. This will reduce your transfer time and size considerably, but will not reduce the number of calls needed to check.
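A rough sketch of that idea, assuming the Graph API updated_time field on album objects and a hypothetical cached_albums table plus a refreshAlbumCache() helper (both are illustrative, not existing code):

    <?php
    // Fetch the page's album list, then re-cache only what has changed.
    $albums = json_decode(file_get_contents(
        "https://graph.facebook.com/{$pageId}/albums?access_token={$accessToken}"
    ), true);

    $select = $pdo->prepare('SELECT updated_time FROM cached_albums WHERE album_id = ?');

    foreach ($albums['data'] as $album) {
        $select->execute([$album['id']]);
        $cachedTime = $select->fetchColumn();

        // Skip albums whose updated_time matches what we already cached.
        if ($cachedTime === $album['updated_time']) {
            continue;
        }

        refreshAlbumCache($pdo, $album); // hypothetical: re-pulls photos, comments, likes
    }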
Since most of this content could change on a regular basis, such as likes, tags, and additions, storing it wouldn't be a great idea. Are there any specific reasons you need to store it? Fetching it from Facebook's CDN would also be faster. What you could do is store the links and such, but I would fetch the actual pictures and connections in real time.
Assume I have a simple website where users can upload photos and follow each other.
If I follow someone, I can see all the updates from the users I follow. Period.
So the user dashboard will show the recently uploaded photos. Simple as that.
My question is:
I was wondering whether it is better to just query the photos table for photos whose user ID belongs to people I follow, or to create an activity table where I can store the uploads etc., or is that redundant?
Well, I suppose it depends on how many different kinds of updates you want to show your users. If photos are the only thing you want to display, it's probably not worth making a separate table.
But if you have lots of different kinds of updates then it could be worth making an "updates" table to aggregate them so you only have to query one table to display the dashboard.
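For what it's worth, the first option can be a single join; this sketch assumes hypothetical photos and follows tables, so adjust the names to your schema:

    <?php
    // Dashboard: latest photos from everyone the current user follows.
    $stmt = $pdo->prepare('
        SELECT p.*
        FROM photos p
        JOIN follows f ON f.followed_id = p.user_id
        WHERE f.follower_id = ?
        ORDER BY p.created_at DESC
        LIMIT 20
    ');
    $stmt->execute([$userId]);
    $dashboard = $stmt->fetchAll(PDO::FETCH_ASSOC);

If more kinds of updates are added later, an aggregated "updates" table as described above becomes the cleaner path.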