sql triggers and technical debt [closed] - php

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 9 years ago.
My boss asked me to create an easier system for looking up points by storing a points total directly on the user table in our MySQL database. The old system had a table of events with their point values, another table recording which events a user had completed, and a third table for points given directly by admins. My job was to add these all together and put the result in a column. Now he says the problem is that there are still queries all over the place adding points, but instead of changing them to simply add points to the user's column upon task completion, he suggested I use a trigger to add points to the user's column whenever one of the other tables has points added to it.
To me this sounds like a work-around that creates technical debt. Am I wrong?
I'm new to the system and I don't know exactly where all the queries are in the PHP pages, but if this is creating technical debt, what would be the appropriate way to fix it?
Since I'm new, I'm probably going to use SQL triggers so as not to go against my boss's suggestion, but I want to at least know the smart/best way to do things.
Doing my best to provide a near-actual (not actual) DB schema:
EVENT: ID, POINT_VALUE, DESC
USER_EVENTS: USER_ID, EVENT_ID, COMPLETION_STATUS
GIVEN_POINTS: USER_ID, POINTS_GIVEN, DESC (each time points are given, so it's more of a log than a running total)
I added a POINTS column to the basic USER table.
The trigger would fire when USER_EVENTS.COMPLETION_STATUS = 'done': find the event's point value and add it to the user's points, instead of changing the queries to do that.

Triggers are a perfectly valid way to accomplish what you are trying to do, as long as the business rules are fairly simple.
There are lots of ways to accomplish moving data from one table to another. You can use triggers, some sort of synchronous PHP process or an asynchronous process using some sort of message queue.
Triggers have the benefit of being simple and fast to code, maintain, and run. The upside is that you only have to do the code once, which is especially nice since you don't know where all the queries that touch these tables are. The downside is that you could be putting business logic into the database, which is where you might start getting into technical debt. The other downside is simply that you've added another business layer, which might not be obvious to the next developer, so they might spend a lot of time trying to figure out how and why the summary table is being updated. Comments are a good thing, in this case.
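To make the trigger idea concrete, here is a minimal runnable sketch. It uses Python's sqlite3 so you can execute it as-is; MySQL's CREATE TRIGGER syntax differs slightly (DELIMITER handling, FOR EACH ROW), and the table and column names are invented from the schema sketched in the question:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE user (id INTEGER PRIMARY KEY, points INTEGER DEFAULT 0);
CREATE TABLE event (id INTEGER PRIMARY KEY, point_value INTEGER);
CREATE TABLE user_event (user_id INTEGER, event_id INTEGER, completion_status TEXT);

-- When a user-event row flips to 'done', add that event's point value
-- to the user's running total. No application code needs to know this.
CREATE TRIGGER add_points AFTER UPDATE OF completion_status ON user_event
WHEN NEW.completion_status = 'done' AND OLD.completion_status != 'done'
BEGIN
    UPDATE user
    SET points = points + (SELECT point_value FROM event WHERE id = NEW.event_id)
    WHERE id = NEW.user_id;
END;
""")

conn.execute("INSERT INTO user (id) VALUES (1)")
conn.execute("INSERT INTO event (id, point_value) VALUES (10, 25)")
conn.execute("INSERT INTO user_event VALUES (1, 10, 'pending')")

# Any of the scattered queries that mark an event done now updates points too.
conn.execute("UPDATE user_event SET completion_status = 'done' "
             "WHERE user_id = 1 AND event_id = 10")
points = conn.execute("SELECT points FROM user WHERE id = 1").fetchone()[0]
print(points)  # 25
```

Note the WHEN clause guarding against re-firing: without it, a row updated to 'done' twice would double-count the points.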
Synchronous PHP processes are nice in that it's very obvious where the code is being executed. The other upside is that you have access to the whole PHP application context and can implement more complex business rules. The downside is that you have to put the function or method call into every place where the table is potentially being touched.
Asynchronous PHP processes have the same upsides and downsides as the synchronous ones, with the added benefit that they won't slow down the user experience. They are also a little more complex to build: you have to handle cases where messages aren't received, or aren't received in the correct order.
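For the asynchronous option, the simplest form is a jobs table drained by a background worker. This is a hedged sketch in Python/sqlite3 (table names are invented; in practice you might use a real queue like a cron-driven worker or a message broker instead of a table):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (id INTEGER PRIMARY KEY, payload TEXT, "
             "done INTEGER DEFAULT 0)")
conn.execute("CREATE TABLE user (id INTEGER PRIMARY KEY, points INTEGER DEFAULT 0)")
conn.execute("INSERT INTO user (id) VALUES (1)")

def enqueue(user_id, points):
    # The web request just records intent and returns immediately.
    conn.execute("INSERT INTO jobs (payload) VALUES (?)",
                 (json.dumps({"user_id": user_id, "points": points}),))

def work():
    # A background worker drains the queue. Order doesn't matter here,
    # because point additions commute -- one less failure mode to handle.
    rows = conn.execute(
        "SELECT id, payload FROM jobs WHERE done = 0 ORDER BY id").fetchall()
    for job_id, payload in rows:
        j = json.loads(payload)
        conn.execute("UPDATE user SET points = points + ? WHERE id = ?",
                     (j["points"], j["user_id"]))
        conn.execute("UPDATE jobs SET done = 1 WHERE id = ?", (job_id,))

enqueue(1, 25)
enqueue(1, 10)
work()
print(conn.execute("SELECT points FROM user WHERE id = 1").fetchone()[0])  # 35
```

Marking jobs done one at a time means a crashed worker can be restarted and will pick up where it left off.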

Related

PHP open file on server with get variables? [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 8 years ago.
Okay, basically, I have a PHP script written all up (it's an MMORPG, so we're clear).
What I'd love to be able to do, rather than writing new (massive) files that contain the exact same data in one big script (so as to create more of an NPC aspect to the game), is to send a request to open page.php with predefined GET variables (i.e., collect=Y or attack=Y, etc.) that would be virtually identical to how a real player sends requests, and have the system open the file, run through it, and make whatever queries to the database it needs before closing it.
I'm confused about how fopen works, to be honest; some things I've read make me believe the above is possible, others not so much.
Any help would be appreciated.
I'm going to go out on a limb, here, and try to solve your problem (as I understand it) rather than answer your question (as you have cast it).
Your fundamental problem is that you have treated PHP files as complete units of code, with input from the query string, processing specific to that file, and output back to the user. This violates the "single responsibility principle", because there are at least three top-level responsibilities here:
Processing user input and deciding on the appropriate action
Performing an action, including manipulation of database structures
Communicating the result of an action back to the user
These can all be broken down into smaller tasks - for instance, the nitty-gritty of connecting to the database should be kept out of the more abstract actions, because changing how an enemy moves, and changing that enemy to be stored in a MongoDB document rather than a MySQL table, should not require changes to the same code.
The solution, therefore, is to embrace structured programming, which in modern PHP (and many other languages) usually means embracing object-oriented programming. So, at a first level of organisation, you might have:
A class for looking at the query string, checking that it makes sense, and creating an abstract list of actions described by it.
Classes representing certain actions such as "Attack" and "Collect", which take details of what is being attacked or collected from the abstract list, and return a different abstract list detailing the results.
Classes representing the player, and enemies or objects within the game, which can be used by the action classes to calculate the outcome in different situations.
Classes for taking the result of actions and displaying them to the user.
Now, instead of saying "I need to create a query string, run the code the page would run, then take the output and use it somehow" you can say "I need to create an action list, run the appropriate actions, and use the result list somehow".
It may sound like that's a lot more work than just forcing PHP to run the existing code, but the power it gives you to create new combinations of existing functionality is not to be under-estimated.
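The action-list idea above can be sketched in a few lines. This is illustrative only (in Python rather than PHP, and the class and parameter names - `Attack`, `Collect`, `target`, `item` - are invented); the point is that a real HTTP request and server-side NPC code can both build the same action list, with no fake page loads:

```python
# Parse request params into action objects, run them, collect results.
class Attack:
    def __init__(self, target):
        self.target = target

    def run(self, world):
        # Game state lives in `world`; the action mutates it and reports back.
        world[self.target] = world.get(self.target, 100) - 10
        return f"attacked {self.target}"

class Collect:
    def __init__(self, item):
        self.item = item

    def run(self, world):
        return f"collected {self.item}"

def actions_from_params(params):
    """Turn query-string-style params into an abstract list of actions."""
    actions = []
    if params.get("attack") == "Y":
        actions.append(Attack(params["target"]))
    if params.get("collect") == "Y":
        actions.append(Collect(params["item"]))
    return actions

world = {}
# An NPC invokes the same code path a player request would.
results = [a.run(world) for a in
           actions_from_params({"attack": "Y", "target": "goblin"})]
print(results)  # ['attacked goblin']
```

The display layer then renders `results` for a player, while NPC code simply inspects them - one implementation of each action, two consumers.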

MySQL: HOW TO MOVE OLD DATA TO DB [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
I finished creating an accounting web application for an organization using CodeIgniter and a MySQL DB, and I have just submitted it to them. They liked the work, but they asked me how they would transfer their old manual data to the new online system, so that their members would be able to see their account balances and contribution history.
This is a major problem for me because most of my tables make use of 'referential integrity' to ensure data synchronization and would not support the style of their manual accounting.
I know a lot of people here have faced cases like this, and I would love to know the best way to collect users' history. I also know this might be flagged as not a real question, but I really, really have to ask people with experience.
I would appreciate all answers. Thanks (and the downvotes too).
No matter what the case is, data conversions are very challenging and almost always time consuming. Depending on the consistency of the data in question, it could be the case that about 80% of the data will transfer over neatly if you create a conversion program using PHP. That conversion code in and of itself may be more time consuming than it is worth. If you are talking hundreds of thousands of records and beyond, it is probably a good idea to make that conversion program work. Anyone who might suggest there is a silver bullet is certainly not correct.
Here are a couple of suggested steps:
(Optional) Export your Excel spreadsheets to Access. Access can help you to standardize data and has tools in place to help you locate records which have failed in some way. You can also create filters in Access if you need to. The benefit of taking this step, if you are familiar with Access, is that you have already begun the conversion process to a database. As a matter of fact, if you so desire, you can import your MySQL database information into Access as well. The benefit of this is pretty obvious: You can create a query and merge your two separate tables together to form one table, which could save you a great deal of coding.
Export your Access table/query into a CSV file (note, if you find it is overkill or if you don't have Access, you can skip step 1 and simply save your .xls or .xlsx file to type .csv. This may require more legwork for your PHP conversion code but that is probably a matter of preference. Some people prefer to avoid Access as much as possible, and if you don't normally use it you will be wasting time trying to learn it just to save yourself a little bit of time).
Utilize PHP's built-in str_getcsv function on each line of the file (or fgetcsv on a file handle). This will turn each CSV record into a PHP array.
Create your automated program to parse through each record. Based on the column and its requirements, you can either accept or reject records. You can then export your data, such as was done in this SO answer, back to CSV. You can save two different CSV files, one with accepted records, and one with rejected records.
With rejected records, which are all but inevitable when transferring from a spreadsheet, you will need to have a course of action. The simplest way for your clients is probably to give them a procedure to either manually import records into the database, if you've given them an interface to do so, or - probably simpler but requiring more back-and-forth - to update the records in Excel to be compliant with the new system.
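The accept/reject pass in steps 3-5 boils down to a validation loop. Here is a hedged sketch (in Python for brevity; the field names `account_id` and `amount` are invented stand-ins for whatever your real columns are):

```python
import csv
import io

def split_records(rows, required=("account_id", "amount")):
    """Accept rows whose required fields are present and whose amount is
    numeric; everything else goes to the rejected pile for manual review."""
    accepted, rejected = [], []
    for row in rows:
        ok = all(row.get(f, "").strip() for f in required)
        if ok:
            try:
                float(row["amount"])
            except ValueError:
                ok = False
        (accepted if ok else rejected).append(row)
    return accepted, rejected

# Simulated legacy spreadsheet export: one good row, one missing an
# account id, one with a non-numeric amount.
raw = "account_id,amount\n17,250.00\n,99\n18,oops\n"
rows = list(csv.DictReader(io.StringIO(raw)))
accepted, rejected = split_records(rows)
print(len(accepted), len(rejected))  # 1 2
```

You would then write `accepted` and `rejected` back out to two separate CSV files, exactly as the answer describes.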
Edit
Based on the thread under your question which sheds more light on what you are trying to do (i.e., have a parent for each transaction that is an accepted loan), you should be able to contrive a parent field, even if it is not complete, by creating a parent record for each set of transactions based around an account. You can do this via Access, PHP, or, more likely, a combination.
Conclusion
Long story short, data conversions take time. If you put the time in up front, it will be far easier to maintain a standardized series of information in the long run. If you find something which takes less time in the beginning, it will mean additional work for you in the long run in order to make this "simple" fix work over time.
Similarly, the closer you can get legacy data to conform to your new data, the easier it will be for your clients to perform queries etc. While this may mean that some manual entry will be required on the part of you or your client, it is better to inform the client of the pros and cons of each method fully and let them decide. My recommendation would always be to put extra work in at the front-end because it almost always ends up cheaper than having to deal with a quick fix in the long run, but that is not always practical given real world constraints.

PHP 'most popular' feature on blog [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
I have a blog system which stores articles in a database. Now I want to build a feature that displays the five most popular articles in the database, according to how many views each gets.
Is there any sort of technology out there which I can take advantage of that reports how many views a page has received and can be integrated with a database?
Or perhaps there is a better internal method of doing something like this?
Thanks in advance.
EDIT: If you are going to downvote my thread randomly, at least tell me why.
You have three choices as an approach for this:
you collect the usage count inside your database (a click counter)
you extract that information from the HTTP server's access log files later
you could implement a click counter based on http server request hits
Each approach has advantages and disadvantages. The first obviously means you have to implement such a counter and modify your database schema. The second means you have asynchronous behavior (not always bad), but the components depend on each other, so your setup gets more complex. So I would advise the first approach.
A click counter is something really basic and typical for all cms/blog systems and not that complex to implement. Since the content is typically generated dynamically (read: by a script) you typically have one request per view, so it is trivial to increment a counter in a table recording views of pages. From there your feature is clear: read the top five counter values and display a list of five links to those pages.
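The whole feature is two queries: an atomic increment per view, and a top-5 select. A runnable sketch (Python/sqlite3 for illustration; the `article` table and `views` column are hypothetical names):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE article (id INTEGER PRIMARY KEY, title TEXT, "
             "views INTEGER DEFAULT 0)")
conn.executemany("INSERT INTO article (id, title) VALUES (?, ?)",
                 [(i, f"post-{i}") for i in range(1, 8)])

def record_view(article_id):
    # One UPDATE per page view. The increment happens inside the database,
    # so concurrent requests can't lose counts to a read-modify-write race.
    conn.execute("UPDATE article SET views = views + 1 WHERE id = ?",
                 (article_id,))

for aid in [3, 3, 3, 5, 5, 1, 2, 2, 6, 2]:
    record_view(aid)

# The "most popular" widget is just an ORDER BY ... LIMIT query.
top5 = conn.execute(
    "SELECT title, views FROM article ORDER BY views DESC, id LIMIT 5"
).fetchall()
print(top5)
# [('post-2', 3), ('post-3', 3), ('post-5', 2), ('post-1', 1), ('post-6', 1)]
```

If write contention on hot articles ever becomes a problem, you can batch increments in the application and flush them periodically, but for a typical blog the plain UPDATE is fine.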
If you go with the second approach then you will need to store that extracted information, since log files are rotated, compressed, archived and deleted. So you either need a self tailored database for that or some finished product. But as said: this approach is much more complex in the end.
The last option is nothing I have seen in the wild; it just sprang to mind. You could, for example, use PHP's auto_append_file feature (just as an example) to run a counting routine in a generic way. That routine could interpret the request URL and decide whether it was a request for an article view. If so, it could increment a click counter, typically in a small database, since you might have several requests at the same time, which speaks against using a file. But why make things that complex? Go with the first option.

What is the best way to create dynamic time-based events like building construction times,etc? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 9 years ago.
I am interested in creating a web game (in the likes of travian.com). I have some knowledge in php, sql and javascript. My question is this:
What is the best way to create dynamic time-based events like building construction times,etc?
I am specifically speaking of events that automatically change the database after a certain period of time. From what I can gather, one could use cron to query the database for updates on building construction times and update them if, for example, the time construction started plus the time the building takes equals (or has passed) the current time.
This would be one solution (which I don't like, as most web servers only allow a 1-minute minimum cron interval).
Another solution seems to be using MySQL events: every time a user asks for a building to be constructed, we would create an object in the database and update it when the event fires. This to me seems the most intuitive answer.
I don't, however, have enough experience or know-how to figure out whether there is a better solution, or whether these solutions are even viable at such a "large" scale.
If anyone could shed some light on the issue or point me in the right direction, I would be most appreciative. I apologize in advance, as English is not my first language.
I believe the answer here depends on whether the updates you need to make are time-driven or event-driven, and on whether they are exclusively updates to other parts of the database.
If they are purely event-driven and purely updates to the database, then SQL Triggers can be a viable option. Even then, opinions in the industry are mixed about whether using Triggers is a good practice, as it can make it difficult for future developers to find the cause of a change. Another downside to triggers is that it can lead to slower performance to the user at the time the update is made (since the database has more complicated work to do than the initial update would have been).
If the updates are time-driven, or they include updates to anything outside of the scope of the database itself (e.g. sometimes updating the filesystem), or you want to avoid Triggers to simplify maintenance for future developers, then the cron-job approach is a great option. The one gotcha here is that a large collection of updates scheduled via cron job can result in a performance drain, so you will need to mitigate that when implementing the job (choosing an off time is one option here, as is implementing a queue system to run the updates, so that the cron job only issues a certain number of updates before letting other database requests go through).
Personally, I prefer a cron-job + queued update process.
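The core of the cron-job approach is a single idempotent "process everything that's due" query, which a scheduler can safely run as often as it likes. A hedged sketch (Python/sqlite3; the `construction` table and its columns are invented for illustration):

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE construction (
    id INTEGER PRIMARY KEY, building TEXT,
    finishes_at INTEGER, done INTEGER DEFAULT 0)""")

now = int(time.time())
# One overdue build and one still in progress.
conn.executemany(
    "INSERT INTO construction (building, finishes_at) VALUES (?, ?)",
    [("barracks", now - 60), ("wall", now + 3600)])

def run_due_updates():
    """What the cron job (or a queued worker) executes each tick: mark
    every construction whose finish time has passed as complete. Running
    it twice is harmless, since finished rows are filtered out."""
    cur = conn.execute(
        "UPDATE construction SET done = 1 "
        "WHERE done = 0 AND finishes_at <= ?",
        (int(time.time()),))
    return cur.rowcount

print(run_due_updates())  # 1 -- only the barracks is overdue
```

A common complement to this is computing state lazily on read (a building is "done" whenever `finishes_at` is in the past), with the cron job only handling side effects such as granting resources.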

data system design

Need some ideas/help on the best way to approach a new data system design. Basically, the way this will work is that there will be a bunch of different databases/tables that will need to be updated on a regular (daily/weekly/monthly) basis with new records.
The people that will be imputing the data will be proficient in excel. The input process will be done via a simple upload form. Then the system needs to add what was imported to the existing data in the databases. There needs to be a "rollback" process that'll reset the database to any day within the last week.
There will be approximately 30 to 50 different data sources. The main interface will be an online search area, so all of the records need to be indexed/searchable.
Ideas/thoughts on how to best approach this? It needs to be built mostly out of PHP/MySQL.
imputing the data
Typo?
What you are asking takes people with several years of formal training to do. Conventionally, the approach would be to draw up a set of requirements, then a set of formal specifications; then the architecture of the system would be designed, then the data design, then the code implementation. There are other approaches which tend to shortcut this. However, even in the case of a single table (although it does not necessarily follow that one "simple upload form" corresponds to one table), with a single developer there's a couple of days' work before any part of the design could be finalised, the majority of which is finding out what the system is supposed to do. But you've given no indication of the usage or data complexity of the system.
Also what do you mean by upload? That implies they'll be manipulating the data elsewhere and uploading files rather than inputting values directly.
You can't adequately describe the functionality of a complete system in a 9 line SO post.
You're unlikely to find people here to do your work for free.
You're not going to get the information you're asking for in a S.O. answer.
You seem to be struggling to use the right language to describe the facts you know.
Your question is very vague.
