Optimize DB connectivity for AngularJS app - php

I have an AngularJS v1 app that connects to an Oracle DB and reads values from it.
At the beginning, each page had just a few such values, so optimization was not in question. But now I have pages that contain dozens of elements with values taken from the DB.
Currently, for each element, the name of the element is passed to a PHP file that opens a connection to the DB, reads the last value (using ROWNUM and ORDER BY time) and returns this value back to AngularJS.
So, as you can imagine, it takes quite a while to display all those values on a page (AngularJS first loads all the values and then displays them together).
I would like to somehow optimize this connectivity.
An example of a page can be seen at: NP04 Photon Detectors
And the code can be found here: Github
The PHP file in question is: app/php-db-conn/elementName.conn.php
Thanks!
UPDATE
I have updated the code to retrieve part of the data using an array, but it seems to me that I'm forming the JSON in an incorrect way. Could someone please help me out?
UPDATE 2
Managed to make it all work, but now it takes even longer to load than before, despite the fact that it transfers less data.
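For reference, the batched approach described in the updates might look roughly like this on the PHP side: one request carries an array of element names, one connection serves all of them, and the response is a single JSON object. This is only a sketch; the OCI8 calls are standard PHP, but the table and column names (element_data, name, elem_value, time) are invented:

<?php
// Sketch: receive a JSON array of element names, e.g. ["elem1","elem2"].
$names = json_decode(file_get_contents('php://input'), true);

$conn = oci_connect('user', 'password', 'localhost/XE');

$result = array();
foreach ($names as $name) {
    // Latest value per element: one connection, many statements.
    $stid = oci_parse($conn,
        'SELECT elem_value FROM (
             SELECT elem_value FROM element_data
             WHERE name = :name ORDER BY time DESC
         ) WHERE ROWNUM = 1');
    oci_bind_by_name($stid, ':name', $name);
    oci_execute($stid);
    $row = oci_fetch_assoc($stid); // Oracle returns column names uppercase
    $result[$name] = $row ? $row['ELEM_VALUE'] : null;
}
oci_close($conn);

// One JSON object for the whole page instead of dozens of requests.
header('Content-Type: application/json');
echo json_encode($result);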

Related

How bad is it to copy-paste my own code blocks?

Actually, I have been learning PHP for the last couple of months, and I am now at a stage where I can program small things like a simple login page in PHP and MySQL, or a contact form. I have written a lot of code blocks, like inserting something into a database or selecting something from a database, etc. But I always copy-paste my own code blocks from previous projects while working on a new one. So I want to know: is this tendency unique to me, or does every beginner pass through the same phase during their journey of becoming a developer?
Please bear with me, because I know this isn't really a programming question and may not be worth your time. I tried searching Google as well, but most of the search results dealt with copy-pasting other people's code, which is not what I am talking about. To save time, I copy-paste my own code blocks almost every time. So, how bad is this behaviour of mine?
I apologize again for posting a question that may not be worth your time, but I am finding it hard to learn to code by myself without a mentor nearby (I actually searched for a mentor who could teach PHP before starting all by myself, but found none in my area) to clear up my doubts, so the Internet is what I mostly depend upon for learning.
This question probably belongs on https://softwareengineering.stackexchange.com but I'll try to give you a decent answer and some guidance.
People re-use their own code all the time. You do not, however, want to copy/paste if you can avoid it. The issue with copy/paste arises when you have something used more than a few times - like a MySQL database connection - and it needs updating. I'd rather modify one file (or one small group of files) and have all of my webapps fixed/updated than have to modify 2 or 3 database calls in 9 different web apps...
For things that I use everywhere/all the time - talking with our course management system's API, authenticating a user against our LDAP server, connecting to a MySQL database and running queries, processing forms that are emailed, etc. - I've built up my own (or coworkers have) sets of functions, classes, etc., which I then keep in a single directory and can include as needed.
If you do this, you want your functions/object methods to be as generic as possible - for example, my MySQL query function takes several arguments: an associative array with connection info (since we have several DB servers based on purpose), a query, and an array of parameters. It returns an array with a status code and then the appropriate data - the result set for selects, the ID of the last insert, or the count of rows affected (for delete/update). This one function handles 50+ queries and connects to 4 different MySQL servers.
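As a rough illustration only (this is not the author's actual function; all names here are invented), such a generic mysqli wrapper might look like:

<?php
// Hypothetical generic query helper: connection info, SQL, parameters in;
// status code plus the appropriate data out.
function run_query(array $connInfo, $sql, array $params = array()) {
    $db = new mysqli($connInfo['host'], $connInfo['user'],
                     $connInfo['pass'], $connInfo['name']);
    if ($db->connect_errno) {
        return array('status' => 'error', 'message' => $db->connect_error);
    }
    $stmt = $db->prepare($sql);
    if ($params) {
        // Bind everything as strings for simplicity (PHP 5.6+ splat).
        $stmt->bind_param(str_repeat('s', count($params)), ...$params);
    }
    $stmt->execute();
    $result = $stmt->get_result(); // needs mysqlnd; false for INSERT/UPDATE/DELETE
    if ($result !== false) {
        return array('status' => 'ok', 'rows' => $result->fetch_all(MYSQLI_ASSOC));
    }
    return array(
        'status'   => 'ok',
        'insertId' => $db->insert_id,       // for INSERTs
        'affected' => $stmt->affected_rows, // for UPDATE/DELETE
    );
}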

Getting all data once for future use

Well, this is kind of a question about how to design a website that uses fewer resources than normal websites, mobile-optimized as well.
Here it goes: say I display an overview of e.g. 5 posts (from e.g. a blog). If I then click, for example, on the first post, I load this post in a new window. But instead of connecting to the database again and fetching this specific post by its ID, I'd just look up that post (in PHP) in my array of 5 posts, which I created earlier when the site was fetched for the first time.
Would it save data to download? Because PHP works server-side as well, so that's why I'm not sure.
Ok, I'll explain again:
Method 1:
User connects to my website.
5 posts are displayed and saved to an array (with all their data).
User clicks on the first post and expects more information about it.
My program looks up the post in my array and displays it.
Method 2:
User connects to my website.
5 posts are displayed.
User clicks on the first post and expects more information about it.
My program connects to MySQL again and fetches the post from the server.
First off, this sounds like a case of premature optimization. I would not start caching anything outside of the database until measurements prove that it's a wise thing to do. Caching takes your focus away from the core task at hand, and introduces complexity.
If you do want to keep DB results in memory, just using an array allocated in a PHP-processed HTTP request will not be sufficient. Once the page is processed, memory allocated at that scope is no longer available.
You could certainly put the results in SESSION scope. The advantage of saving some DB results in the SESSION is that you avoid DB round trips. Disadvantages include the increased complexity of programming the solution, use of memory on the web server for data that may never be accessed, and increased initial load on the DB to retrieve the extra pages that may or may not ever be requested by the user.
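For illustration, session-scoped caching might look like this minimal sketch (the $_SESSION['posts'] key and the fetch_post_from_db() helper are placeholders):

<?php
session_start();

if (isset($_SESSION['posts'][$postId])) {
    // Cache hit: reuse the post fetched on an earlier request.
    $post = $_SESSION['posts'][$postId];
} else {
    $post = fetch_post_from_db($postId); // hypothetical DB helper
    $_SESSION['posts'][$postId] = $post; // cache it for later requests
}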
If DB performance, after measurement, really is causing you to miss your performance objectives you can use a well-proven caching system such as memcached to keep frequently accessed data in the web server's (or dedicated cache server's) memory.
Final note: You say
PHP works server-side as well
That's not accurate. PHP works server-side only.
Have you thought about saving the posts in divs, and only making them visible when the user clicks somewhere? Here's how to do that.
Put some sort of cache between your code and the database.
So your code will look like
if (isPostInCache()) {
    // Serve the post from the cache.
    loadPostFromCache();
} else {
    // Cache miss: fall back to the database.
    loadPostFromDatabase();
}
Go for some caching system; the web is full of them. You can use memcached, or static caching you can build yourself (e.g. save posts in txt files on the server).
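A minimal memcached sketch, assuming the PHP Memcached extension and a hypothetical load_post_from_db() helper:

<?php
$cache = new Memcached();
$cache->addServer('localhost', 11211);

$post = $cache->get('post_' . $id);
if ($post === false) {                      // cache miss
    $post = load_post_from_db($id);         // hypothetical DB helper
    $cache->set('post_' . $id, $post, 300); // keep for 5 minutes
}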
To me, this is a little more inefficient than making a second call to the database, and here is why.
The first query should only be pulling the fields you want, like title, author, and date. The content of the post may be a heavy query, so I'd exclude that (you can pull a teaser if you'd like).
Then, if the user wants the details of the post, I would query for the content with an indexed key column.
That way you're not pulling content for 5 posts that may never be seen.
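In code, that two-step split might look like this sketch (assuming a mysqli connection in $db; the posts table and its columns are invented):

<?php
// 1) Overview: light columns only, no heavy content.
$overview = $db->query(
    'SELECT id, title, author, date FROM posts ORDER BY date DESC LIMIT 5'
);

// 2) Detail: pull the heavy content only when a post is opened,
//    keyed on the indexed primary key.
$stmt = $db->prepare('SELECT content FROM posts WHERE id = ?');
$stmt->bind_param('i', $postId);
$stmt->execute();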
If your PHP code is constantly re-connecting to the database, you've configured it wrong and aren't using connection pooling properly. The execution time of a query should be a few milliseconds at most if your stack is properly tuned. Do not cache unless you absolutely have to.
What you're advocating here is side-stepping a serious problem. Database queries should be effortless provided your database is properly configured. Fix that issue and you won't need to go down the caching road.
Saving data from one request to the next is a broken design, and if not done perfectly it could lead to embarrassing data-bleed situations where one user sees content intended for another. This is why caching is an option usually pursued after all other avenues have been exhausted.
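For what it's worth, in PHP's mysqli a persistent (pooled) connection is requested simply by prefixing the host with p: - a sketch with placeholder credentials:

<?php
// The 'p:' prefix makes mysqli reuse an existing idle connection
// instead of opening a new one on every request.
$db = new mysqli('p:localhost', 'user', 'password', 'mydb');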

Require best Ajax session array pagination

I am using PHP and MySQL.
First, on form submission, I run a query on MySQL, get the row IDs, and store them in a session array.
To display the results, I have modified a PHP array pagination script to paginate the session array with first/prev/next/last and jump-to-page-number functionality. For that pagination I took my reference from:
http://lotsofcode.com/php/php-array-pagination.htm
Script is working fine.
But I have two questions.
Question 1: Is it fine to store a big result in a session array? If not, what would be a good alternative? (I am wondering about keeping the first 500 results in the session array and, if there are more than 500 results, creating an XML file.)
Question 2: Is it possible to use Ajax for pagination working with a session array, with first/prev/next/last and jump-to-page-number functionality?
If there is any solution, please update me.
Thanks
Ravindra.
I wouldn't use sessions to store large amounts of data, and I wouldn't use sessions to store data returned from a table (MySQL or otherwise).
The session data is stored on the server - and held within memory - so what's going to happen when you get multiple users using the same table?
The database (MySQL or otherwise) should be sufficient, with the correct indexes in place, to handle queries for displaying data.
I have a MySQL table with 120 million records and can extract data quickly using date ranges - no speed issues whatsoever.
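As an alternative to a session array, plain LIMIT/OFFSET pagination against an indexed table is usually enough; a sketch, with invented table and parameter names and a mysqli connection assumed in $db:

<?php
$perPage = 20;
$page    = isset($_GET['page']) ? max(1, (int)$_GET['page']) : 1;
$offset  = ($page - 1) * $perPage;

// Fetch only the rows for the requested page; the index does the work.
$stmt = $db->prepare('SELECT id, title FROM results ORDER BY id LIMIT ? OFFSET ?');
$stmt->bind_param('ii', $perPage, $offset);
$stmt->execute();
$rows = $stmt->get_result()->fetch_all(MYSQLI_ASSOC); // needs mysqlnd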

Inserting several hundred records into mysql database using mysqli & PHP

I'm making a Flex 4 application and using ZendAMF to interact with a MySQL database. I got Flex to generate most of the services code for me (which utilizes mysqli) and roughly edited some of the code (I'm very much a novice when it comes to PHP). All works fine at this point.
My problem is: currently the application inserts ~400 records into the database when a user is created (it's saving their own data for them to load at a later date), but it does this with separate calls to the server - i.e. each record is sent to the server, then saved to the database, and then the next one is sent.
This worked fine in my local environment, but since going on a live webserver it only adds these records some of the time. Other times it will totally ignore them. I'm assuming this is because the live database doesn't like getting spammed with hundreds of requests at practically the same time.
I'm thinking a more efficient way would be to package all of these records into an array, send that to the server just the once and then get the PHP service to do multiple inserts on each item in the array. The problem is, I'm not sure how to go about coding this in PHP using mysqli statements.
Any help or advice would be greatly appreciated - thanks!
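One common mysqli pattern for this is a single prepared statement executed in a loop inside one transaction, so the whole batch is sent to PHP in one call and committed once. A sketch only - the user_data table, its columns, and the shape of $records are assumptions:

<?php
// $records: array of rows sent from Flex in a single call,
// e.g. array(array('name' => ..., 'value' => ...), ...)
$db = new mysqli('localhost', 'user', 'password', 'mydb');

$db->autocommit(false); // start a transaction
$stmt = $db->prepare('INSERT INTO user_data (user_id, name, value) VALUES (?, ?, ?)');
foreach ($records as $r) {
    $stmt->bind_param('iss', $userId, $r['name'], $r['value']);
    $stmt->execute(); // one statement, re-executed per row
}
$db->commit(); // all ~400 rows become visible at once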
Read up on LOAD DATA LOCAL INFILE. It seems it's just what you need for inserting multiple records: it inserts many records from a file (though not an array, unfortunately) into your table in one operation.
It's also much, much faster to insert multiple records with LOAD DATA LOCAL INFILE than with one-per-row INSERTs.
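Used from PHP it could look roughly like this (a sketch; the table, columns, and temp-file handling are illustrative, and local_infile must be enabled on both client and server):

<?php
// Write the records to a temporary CSV file first.
$file = tempnam(sys_get_temp_dir(), 'rows');
$fh   = fopen($file, 'w');
foreach ($records as $r) {
    fputcsv($fh, array($userId, $r['name'], $r['value']));
}
fclose($fh);

// Load the whole file in a single statement.
$db->query("LOAD DATA LOCAL INFILE '" . $db->real_escape_string($file) . "'
            INTO TABLE user_data
            FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
            (user_id, name, value)");
unlink($file);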
You should handle the user defaults separately and only store the changes.
And if the records are not saved, you must check warnings from MySQL or errors from PHP - data doesn't just disappear.
Try
error_reporting(-1);
just before inserting, and
mysqli::get_warnings()
afterwards.

Dashcode mysql datasource binding

Hi, I've got a tricky question (aren't they all tricky?).
I'm converting a database-driven site that uses PHP to a site built with Dashcode.
The current site selects data held in a MySQL database and dynamically creates the page content. Originally this was done to reduce site maintenance, because all the current content could be maintained and checked offline before uploading to the live database, therefore avoiding code changes.
In Dashcode you can work from a JSON file as a data source - which is fine, it works - except for the maintenance aspect. The client is not willing (and I understand why) to update several hundred lines of fairly structured JS object code when the database holds the data and is updated from elsewhere.
So - What's the best way to get Dashcode to link to the database data?
Where are you getting the JSON from? Is it being generated from the original MySQL data? Could you not generate the JSON from MySQL and therefore keep the original maintenance procedure prior to uploading to MySQL?
For my projects I usually create a PHP intermediate that, when accessed, logs into the MySQL database and formats the results as XML in the body of the page. Just point Dashcode to the PHP file in the data source. Parameters can even be passed into the PHP script with GET through the URL in the data source.
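A minimal sketch of such an intermediate (credentials, table, and field names are placeholders):

<?php
// Hypothetical Dashcode data source: query MySQL, emit XML.
$db = new mysqli('localhost', 'user', 'password', 'mydb');

$id  = isset($_GET['id']) ? (int)$_GET['id'] : 0; // parameter via GET
$sql = 'SELECT title, body FROM content' . ($id ? ' WHERE id = ' . $id : '');
$result = $db->query($sql);

header('Content-Type: text/xml');
echo '<?xml version="1.0"?>' . '<items>';
while ($row = $result->fetch_assoc()) {
    echo '<item><title>' . htmlspecialchars($row['title']) . '</title>'
       . '<body>' . htmlspecialchars($row['body']) . '</body></item>';
}
echo '</items>';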
