I am creating a web-based portal with web application features, targeting millions of users.
I want to store every user's login activity, as well as all of their record activity such as adds, modifications, and deletes.
I am planning to store all of this activity information outside the database;
I want to use XML files to store it instead.
If I store all of this information in the database, the database size will become very large.
If I am wrong, please give me another suggestion.
I am using PHP and MySQL in my application.
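For comparison, here is a minimal sketch of what such an activity log might look like in MySQL with PHP; the table name, columns, and credentials below are purely illustrative, not part of the original setup.

    <?php
    // Illustrative only: one possible shape for an activity log table and a logging helper.
    $db = new mysqli('localhost', 'user', 'pass', 'portal'); // placeholder credentials

    $db->query("
        CREATE TABLE IF NOT EXISTS user_activity_log (
            id          BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
            user_id     INT UNSIGNED NOT NULL,
            action      ENUM('login','add','modify','delete') NOT NULL,
            record_type VARCHAR(64)  NULL,
            record_id   INT UNSIGNED NULL,
            created_at  DATETIME     NOT NULL,
            INDEX idx_user_time (user_id, created_at)
        )
    ");

    function log_activity(mysqli $db, $userId, $action, $recordType = null, $recordId = null) {
        $stmt = $db->prepare(
            'INSERT INTO user_activity_log (user_id, action, record_type, record_id, created_at)
             VALUES (?, ?, ?, ?, NOW())'
        );
        $stmt->bind_param('issi', $userId, $action, $recordType, $recordId);
        $stmt->execute();
    }

    // Example: record a login event for user 42.
    log_activity($db, 42, 'login');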
I would like to create an e-learning platform. Users will have a lot of things to choose from (mostly visible only to them), like:
add note
add movies to favorite
rate the instructor
And a few options that auto-save for each user, like:
unanswered questions
wrong answer questions
movies in progress (the user watched only 2 minutes of 5)
So what database or method should I use to store that kind of data?
I do not want to use cookies, because this needs to be saved on the user's account and not in the browser; the user needs to have all of it available on every browser or mobile device.
I am wondering about JSON, but if I do that, each user's ID would be available to view... so should I use MySQL?
I would recommend that you build your own data logger. What I mean by this is to build yourself a place to store every user's data, an "eManager" if you like.
Once this has been built, you can assign the e-learning courses by ID to each user's profile in your "eManager", allowing you to keep track of each user's progress, etc.
The "eManager" could also save the user's notes, wrong answers, and unanswered questions, and you could create surveys with a slider rating to rate the user. Honestly, the possibilities are endless.
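A rough sketch of what such "eManager" storage could look like in MySQL follows; all table names, column names, and credentials here are assumptions for illustration only.

    <?php
    // Illustrative schema: per-user course progress and notes.
    $db = new mysqli('localhost', 'user', 'pass', 'elearning'); // placeholder credentials

    $db->query("
        CREATE TABLE IF NOT EXISTS user_course_progress (
            user_id      INT UNSIGNED NOT NULL,
            course_id    INT UNSIGNED NOT NULL,
            seconds_seen INT UNSIGNED NOT NULL DEFAULT 0,  -- e.g. 120 of a 300-second movie
            completed    TINYINT(1)   NOT NULL DEFAULT 0,
            PRIMARY KEY (user_id, course_id)
        )
    ");

    $db->query("
        CREATE TABLE IF NOT EXISTS user_notes (
            id        INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
            user_id   INT UNSIGNED NOT NULL,
            course_id INT UNSIGNED NOT NULL,
            note      TEXT         NOT NULL,
            created   DATETIME     NOT NULL
        )
    ");

    // Example: record that user 7 has watched 2 of 5 minutes of course 12.
    $userId = 7; $courseId = 12; $secondsSeen = 120;
    $stmt = $db->prepare(
        'REPLACE INTO user_course_progress (user_id, course_id, seconds_seen) VALUES (?, ?, ?)'
    );
    $stmt->bind_param('iii', $userId, $courseId, $secondsSeen);
    $stmt->execute();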
You can receive the data in two different ways:
(Personal) You can ask your users to email you requesting a username, and then you generate a password and send it out to the user.
(Commercial) You build your eManager to receive the data from the website, which isn't too difficult to do.
It will be a long process. To answer your question from a different angle: practice SQL and PHP, since that will be your foundation; make sure you can run more advanced queries and can confidently edit your database, etc.
Any more questions, just let me know. Thanks.
I would like to be able to sync certain tables from a remotely hosted database into a WordPress database; these tables do not contain any WordPress-related content. They hold simple warranty lookup info that returns a simple yes/no indicating whether a specific product serial number is past its warranty date. I have SSH access and privileges set up for this, and my IP is whitelisted.
On the management side, the user enters this data into an internal system that pushes the information to their server hosting the database.
My goal is to avoid disrupting the system they are using and simply use a form on the site to access the lookup feature with the data imported from the external database. I hope this makes sense.
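Purely as a sketch of what the lookup side could look like once the table has been synced into the WordPress database; the table and column names (wp_warranty, serial_number, warranty_end) are assumptions, not taken from the actual system.

    <?php
    // Hedged sketch: look up a serial number in a synced warranty table, inside WordPress.
    global $wpdb;

    $serial = sanitize_text_field( isset( $_POST['serial'] ) ? $_POST['serial'] : '' );

    $warranty_end = $wpdb->get_var( $wpdb->prepare(
        "SELECT warranty_end FROM wp_warranty WHERE serial_number = %s",
        $serial
    ) );

    if ( null === $warranty_end ) {
        echo 'Serial number not found.';
    } elseif ( strtotime( $warranty_end ) >= time() ) {
        echo 'Yes: this product is still under warranty.';
    } else {
        echo 'No: the warranty has expired.';
    }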
I appreciate any help or guidance in this.
We built a survey tool on top of Google Forms using WordPress.
Simply put: you create a Google Form, create a private open link, and put it into a WordPress backend page; the system then processes the module server-side and generates the necessary HTML file. When the user fills in the form and submits it, the server uses Zend Gdata (called via AJAX) to write the results to the spreadsheet connected to the form, et voilà.
But this system is limited, partly because Google Forms itself is quite limited. We want to improve it.
That's why I'm asking for your opinions on upgrading the system with some more features:
We want to be able to keep the form open so that users can fill it in on more than one occasion. Theoretically, then, we need to know which user each spreadsheet row is connected to. This could be done by saving some sort of ID key to recognize the user, but then we don't know how to re-fill the fields in the form, since the spreadsheet created from the form doesn't retain any kind of key connecting columns to form fields.
We need more field types, like a file upload field that puts the uploaded file into a specific Google Drive folder.
We need to see the data for a single entry, whereas Google only gives you the whole spreadsheet, which is quite hard to read.
It's not an easy task! Which solutions should we use to solve these problems?
Many thanks!
UPDATE:
We decided to go with a mix of Google Forms, Google Fusion Tables, and Google Charts via API access. Here's the simplified algorithm:
The admin user creates their form via Google Forms and saves the URL. To get more field types, the user can put a tag in the field comment, e.g. [file] for, well, file uploads.
The URL is put into an admin page of our system. The page fetches the content of the form page and extracts, for every field, the title, the ID, the type, and the comment into an array; if there is a tag in the comment, it becomes the field type.
Using this data, the system creates (if it doesn't already exist) a folder with a Fusion Table inside. If file fields are present, another subfolder is generated. The addresses of these folders and files are saved.
Using the array data, a column is created in the Fusion Table for each of the array fields, with a column title of the form "[field_ID field_type]field_title", plus a column for the end-user ID.
The admin user can, moreover, open or close the form.
When a user goes to the form page, the array is used to generate the form. If the system doesn't have the user's ID in memory, it means the user has never filled in the form. Otherwise, the system uses the user ID to fetch the data from the Fusion Table and populate the form.
When the user fills in the form, the entries are fed to the columns using the field ID as the reference, plus the user ID. The user ID is also stored in the system to remember that the user has already filled in the form, as described in the previous step. If files are uploaded, they are stored in a Google Drive folder.
The admin user can then go to the admin page and see how many people have filled in the form, request single-user or summary data using Google Charts, and download a PDF of the data for a single user, every user, or the summary.
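As a very rough illustration of the insert step, assuming the Fusion Tables v1 SQL endpoint and an already obtained OAuth access token; the table ID, access token, column titles, and answer values below are all placeholders.

    <?php
    // Sketch only: push one form submission into the Fusion Table via its SQL API.
    // Column titles follow the "[field_ID field_type]field_title" convention described above.
    $fusionTableId = 'FUSION_TABLE_ID';     // placeholder
    $accessToken   = 'OAUTH_ACCESS_TOKEN';  // placeholder
    $endUserId     = 'user-123';            // placeholder
    $answers       = array('q1' => 'Alice');

    $sql = sprintf(
        "INSERT INTO %s ('user_id', '[q1 text]Your name') VALUES ('%s', '%s')",
        $fusionTableId,
        addslashes($endUserId),
        addslashes($answers['q1'])
    );

    $ch = curl_init('https://www.googleapis.com/fusiontables/v1/query');
    curl_setopt_array($ch, array(
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query(array('sql' => $sql)),
        CURLOPT_HTTPHEADER     => array('Authorization: Bearer ' . $accessToken),
        CURLOPT_RETURNTRANSFER => true,
    ));
    $response = curl_exec($ch);
    curl_close($ch);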
Of course this is just the idea; we still have to build it. One initial question is whether we should use JavaScript or PHP to communicate with Google, i.e. whether the processing happens on the client or the server side...
If you're asking about JavaScript vs. PHP, you should know that the JavaScript API can't write to a Google Spreadsheet because of cross-domain security issues.
PHP can, as it is a server-side language. Zend Framework makes it easy to interact with Google Spreadsheets: http://framework.zend.com/manual/1.12/en/zend.gdata.spreadsheets.html
So go with PHP if that was your question.
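For instance, a minimal server-side sketch using Zend Framework 1's Gdata component; the account credentials, spreadsheet key, and worksheet ID are placeholders, and the row keys must match the spreadsheet's column headers (lower-cased, spaces removed).

    <?php
    // Minimal sketch: append one form submission to the connected spreadsheet.
    require_once 'Zend/Loader.php';
    Zend_Loader::loadClass('Zend_Gdata_Spreadsheets');
    Zend_Loader::loadClass('Zend_Gdata_ClientLogin');

    $client = Zend_Gdata_ClientLogin::getHttpClient(
        'account@example.com',                      // placeholder account
        'application-password',                     // placeholder password
        Zend_Gdata_Spreadsheets::AUTH_SERVICE_NAME
    );
    $service = new Zend_Gdata_Spreadsheets($client);

    $spreadsheetKey = 'SPREADSHEET_KEY';            // placeholder
    $worksheetId    = 'od6';                        // default first worksheet

    $rowData = array(
        'name'   => $_POST['name'],
        'answer' => $_POST['answer'],
    );
    $service->insertRow($rowData, $spreadsheetKey, $worksheetId);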
I've been playing with the Google+ API PHP starter kit. My ultimate goal is to run a cron job on my server that uses the Google+ API to grab the activity data and store it in a MySQL database, from which I can dynamically update a Twitter-like feed on a website.
Is this a possible/practical way of mining a Google+ profile's public stream? If not, what do you suggest as a good alternative?
Sounds like a good way to me.
One thing you have to keep in mind, though, is that there is no easy way to check whether an activity has been deleted after you have added it to your database. You would have to call $plus->activities->get($activityId) for each of the activities you want to display to see whether it still exists (unless, of course, you don't mind deleted activities still appearing on your website).
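A rough sketch of such a cron script, assuming the old Google APIs PHP client bundled with the starter kit; the API key, profile ID, MySQL table, and credentials are placeholders.

    <?php
    // Sketch: pull recent public activities and mirror them into MySQL.
    require_once 'google-api-php-client/src/Google_Client.php';
    require_once 'google-api-php-client/src/contrib/Google_PlusService.php';

    $client = new Google_Client();
    $client->setApplicationName('Activity Mirror');
    $client->setDeveloperKey('YOUR_API_KEY');   // public data only needs an API key
    $plus = new Google_PlusService($client);

    $db = new mysqli('localhost', 'user', 'pass', 'feed'); // placeholder credentials

    $feed = $plus->activities->listActivities('PROFILE_ID', 'public', array('maxResults' => 20));
    foreach ($feed['items'] as $activity) {
        $id        = $activity['id'];
        $title     = $activity['title'];
        $published = $activity['published'];

        $stmt = $db->prepare('REPLACE INTO activities (id, title, published) VALUES (?, ?, ?)');
        $stmt->bind_param('sss', $id, $title, $published);
        $stmt->execute();
    }
    // Before rendering the feed, $plus->activities->get($storedId) can be used to
    // confirm that an already stored activity still exists.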
Apart from that your solution will work fine.
Suppose you're developing an independent, small sub-page for a big and well-frequented web portal.
The sub-page shows entries from a public event calendar and allows users to highlight those especially interesting to them. The highlighted events should stay highlighted (and maybe be shown in a separate list) on each future visit by that user.
However, building a classical user registration system, or any other way of storing the user-highlighted event picks on the server, is not an option: the sub-module needs to be as self-contained as possible and require as little maintenance as possible. That is one of the conditions of the project.
The only way to do this without building a login system of some sort (as far as I can see) is using cookies or some other local storage (Flash, HTML5, ...), which has the obvious and big downside that it is tied to the computer, not the user.
Is there a way of storing a few kilobytes of data on a per-person basis, but without having to use a login or OpenID, that I am overlooking? A reliable web service, perhaps?
A "key/value" storage service, to which I pass a unique key (one that the user specified) and get the saved value in return, would be sufficient. There is no need for real security: the data in question is by no means confidential.
OpenID is not an option: It is not well known enough among the audience of the site.
Facebook would be an option, but I don't think they provide "storage" options like this.
As a workaround, I am contemplating offering users their event picks as a text file download that can also be uploaded and turned into cookies on another machine. But that is pretty complicated for the user, and thus not ideal.
We have a similar system on our site, where users can bookmark pages into a planner/wishlist function. The saved items are sent via a web service and stored on our server, and there is a corresponding get web service.
We have a "lazy register" system. The first time a user saves an item, they are asked for their email (but no password, as nothing is confidential). This is hashed and saved locally using a cookie, then used to set/get the saved items. When the user uses a different computer, they are asked for their email again.
The key is that registering and logging in are the same operation, so there is no need for any password reminders or reset functionality.
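A minimal sketch of that flow in PHP; the cookie name, table, and credentials are illustrative assumptions, not details of the actual site.

    <?php
    // Sketch of a "lazy register": the email hash is both the registration and the login.
    if (isset($_POST['email'])) {
        // First save on this browser: derive a stable key from the email address.
        $userKey = hash('sha256', strtolower(trim($_POST['email'])));
        setcookie('planner_key', $userKey, time() + 60 * 60 * 24 * 365, '/');
    } elseif (isset($_COOKIE['planner_key'])) {
        $userKey = $_COOKIE['planner_key'];
    } else {
        $userKey = null; // no key yet: ask for the email before the first save
    }

    if ($userKey !== null && isset($_POST['items'])) {
        // Save the highlighted events keyed on the hash; a simple key/value table is enough.
        $db = new mysqli('localhost', 'user', 'pass', 'portal'); // placeholder credentials
        $stmt = $db->prepare('REPLACE INTO saved_items (user_key, items_json) VALUES (?, ?)');
        $stmt->bind_param('ss', $userKey, $_POST['items']);
        $stmt->execute();
    }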
The Google Docs API provides programmatic access to Google Docs, where you can create and store documents and spreadsheets. Your application could have its own Google login, which it uses to create one or more documents per user. These documents could be used to store the user settings.
Provided you can get a unique ID from each user (an email address, or something more secure, perhaps), this should be fairly simple. You can even organize the files into folders—one per user.
Alternatively, you could combine Google Docs with the Google Spreadsheets API, where I have just noticed this rather handy feature:
Tables & Records: interact with spreadsheets as if they're a database using Tables and Records.