I'm looking for some advice, specifically about presenting data stored in MySQL (which I manage through phpMyAdmin).
I have a (local) website that registers ideas from different people; those ideas are stored in a database.
I'm able to retrieve all the data into a table in a reports section, but once there are 1500 records or more the page gets slow (I know the amount of data printed in the table is what slows it down).
I generate some charts from that data using Chart.js, which works well until there are a lot of records on the page (I tested with 1500, 2000, and 5000 records; 2000 is the maximum we get with these ideas).
So, what do I want to know?
What is an efficient way of showing that amount of data to the user without the page getting extremely slow?
What is the best practice for generating charts: from a table that gets its data from the database, or directly from the database itself? And in either case, which chart framework would you recommend?
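For the charts in particular, the usual fix is to aggregate before plotting, so Chart.js receives a few dozen points instead of thousands of rows. A minimal sketch, assuming each record carries a `week` field (the field name is an assumption; in SQL the same reduction would be a `GROUP BY`):

```php
<?php
// Aggregate raw records into per-week counts so the chart only
// receives a handful of points instead of thousands of rows.
// Assumes each record is an associative array with a 'week' key;
// adapt the key name to your actual schema.
function countPerWeek(array $records): array
{
    $counts = [];
    foreach ($records as $record) {
        $week = $record['week'];
        $counts[$week] = ($counts[$week] ?? 0) + 1;
    }
    ksort($counts); // keep weeks in order for the chart's x-axis
    return $counts;
}

// Example: three ideas in week 14, one in week 15.
$records = [
    ['week' => 14], ['week' => 14], ['week' => 15], ['week' => 14],
];
$counts = countPerWeek($records);
// $counts is [14 => 3, 15 => 1]
```

The aggregated array can then be passed to Chart.js as its labels and data arrays; the page never has to hold the raw rows.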
I have read about pagination, which retrieves only one page of data at a time, but my export-to-Excel feature reads the table's data to build the file, so pagination would break the export option.
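Pagination and the export don't have to conflict: the table view can fetch one page at a time, while a separate export endpoint queries the full result set server-side. A sketch of the page-to-offset arithmetic (the table and column names in the comment are assumptions):

```php
<?php
// Server-side pagination: the table view fetches only one page,
// while the Excel export hits a separate endpoint that queries
// everything. Table/column names below are assumptions.
function pageClause(int $page, int $perPage = 50): array
{
    $page   = max(1, $page);
    $offset = ($page - 1) * $perPage;
    // Used as: SELECT ... FROM ideas ORDER BY id LIMIT :limit OFFSET :offset
    return ['limit' => $perPage, 'offset' => $offset];
}

// Page 3 at 50 rows per page starts at row 100.
$clause = pageClause(3);
// $clause is ['limit' => 50, 'offset' => 100]
```

The export script would run the same query without the LIMIT/OFFSET clause, so it always sees every record regardless of which page the table is showing.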
I'm really sorry if this is not the place for this programming question, but I can't think of an efficient way of doing this.
Here is a picture of part of my webpage: at the top you can see some charts, below you can see some records of the table, and the middle is just report generation by date/week and the export-to-Excel option.
Tech used in the webpage:
PHP
jQuery
HTML5
CSS3
MySQL
Any help will be appreciated.
Related
I have a problem with PDF generation using mPDF 6.
The project consists of many data forms stored in a database.
And we have the reports section where users can generate PDFs from the search results.
Some tables generate PDFs just fine. Others show the spinner hanging for a very long time without a result.
The memory limit is set very high (550M), but even small data tables still get stuck at the spinner.
What am I doing wrong?
Thank you in advance !
I don't know if this is the right place to ask my question (perhaps I should ask in another community), but I have a question about building a website that takes a large CSV file as input.
I want to build a website that uses a CSV file as input to generate all kinds of data on the site. Think of each entry as having its own geolocation, etc. In the end I want to create an informative website where the data from the CSV file is used.
Now I want to know which approach might be best for this. I can imagine that storing this CSV file in a database and retrieving the data with PHP might take forever, due to long loading times whenever the website makes a call. Can anyone tell me which approach might work when dealing with such a large CSV file?
To give you an idea. In my CSV file I have the following data:
#, Soccer Club, Street address, Highest competition,..
1, Soccer club 1, address 123, 3rd division,..
2, Soccer club 2, address 456, 6th division,..
etc.
Now I want to create a website that uses this data to place all kinds of markers on a map. I do know how to code this; the only problem is that I am looking for a fast way of reading and using the data, without making the website take a long time to load while generating this information. I haven't tried it yet, but before I spend too much time on it I was wondering what advice you could give me.
BTW, think of CSV files in the range of 1 GB to 2 GB.
I would suggest importing the CSV into a database table, rather than storing the whole file as-is, and then running your queries on it. Databases are optimised for, and meant for, exactly these things.
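As a sketch of that import (assuming the column layout from the question; for a one-off 1–2 GB load, MySQL's `LOAD DATA INFILE` statement is faster still), `fgetcsv()` lets PHP stream the file row by row instead of loading it all into memory:

```php
<?php
// Stream a large CSV row by row with fgetcsv() instead of loading
// the whole 1-2 GB file into memory; each row would then be bound
// to a prepared INSERT. The file path and column layout are
// assumptions based on the question's example.
function importCsv(string $path, callable $insertRow): int
{
    $handle = fopen($path, 'r');
    // Separator/enclosure/escape passed explicitly (the defaults
    // are deprecated in newer PHP versions).
    $header = fgetcsv($handle, null, ',', '"', '\\'); // skip "#, Soccer Club, ..." line
    $count  = 0;
    while (($row = fgetcsv($handle, null, ',', '"', '\\')) !== false) {
        $insertRow($row);            // e.g. $stmt->execute($row) with PDO
        $count++;
    }
    fclose($handle);
    return $count;
}
```

Because only one row is in memory at a time, the script's memory use stays flat no matter how large the file is; wrapping the inserts in a transaction speeds the load up considerably.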
A few months ago, my supervisor needed me to create a form to collect some data - at the time, the most sensible thing to do was just to whip together a Google Form, no problem. Now it turns out that the data collection program needs to expand, and it makes more sense to direct the form to a MySQL database (using PHP) instead - again, no problem, already done. However, for reasons I won't elaborate on here, my supervisor still wants that data to go to the original Google Sheet as well.
My question: is it possible to submit form data to two separate stores, when one is a MySQL table and the other is a Google Sheet?
Thanks!
I know that feel bro ;)
There are some resources for handling boss requests:
PHPExcel Class is a library written in pure PHP and providing a set of classes that allow you to write to and read from different spreadsheet file formats, like Excel (BIFF) .xls, Excel 2007 (OfficeOpenXML) .xlsx, CSV, HTML...
You will need a script to process the outputs in both directions:
Excel / CSV > MySQL
MySQL > Excel / CSV
If you need to work directly with Google Docs, you will have to follow the previous steps and then play with the following project:
php-google-spreadsheet-client is a library that provides a simple interface to the Google Spreadsheet API.
You will need to store data in MySQL and pass it to Google.
Second idea:
If your boss needs it NOW, you can give Zapier a chance and get it done fast. If you don't need a very fast sync time, it can be a good free option.
I hope it helps :)
I've currently got a database with just short of 2000 client locations in Australia. What I am trying to do is to display this data on a heatmap, to be embedded into an existing website.
I've done a heap of looking around, and can't seem to find exactly what I'm after.
http://www.heatmapapi.com/sample_googlev3.aspx
http://www.heatmaptool.com/documentation.php
These are along the lines of what I want to achieve; however, I cannot see how to make them work with data from a MySQL database (they require the data to be hard-coded or uploaded through CSV files).
Has anyone come across this sort of thing before, or managed to achieve it?
Both of the examples you provide would potentially work.
With the first, you would need to use the data you have to dynamically generate the JavaScript, or at least the values that go into it.
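A sketch of that dynamic generation, assuming the points come out of a MySQL query as an array of lat/lng pairs (the variable and key names are assumptions):

```php
<?php
// Dynamically generating the JavaScript values: json_encode() turns
// PHP rows (which would really come from a MySQL query) into a
// literal the heatmap script on the page can consume directly.
$points = [
    ['lat' => -33.86, 'lng' => 151.20],
    ['lat' => -37.81, 'lng' => 144.96],
];
$js = 'var points = ' . json_encode($points) . ';';
// In the page template: echo "<script>$js</script>";
```

This keeps the client-side heatmap code unchanged; only the data literal is produced server-side on each page load.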
The second is probably the better option: instead of a static CSV upload, you would provide the path to a script that dynamically generates the CSV file.
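A sketch of such a script, assuming the tool accepts latitude,longitude,weight rows (check heatmaptool.com's documented column format): the endpoint queries MySQL for the ~2000 locations and prints CSV, so the tool is pointed at the script's URL rather than at an uploaded file.

```php
<?php
// A PHP endpoint can stand in for the "uploaded CSV": it would query
// the locations table and print CSV. The rows here are a stand-in
// for the MySQL result set; the column meanings are assumptions.
function rowsToCsv(array $rows): string
{
    $out = fopen('php://temp', 'r+');
    foreach ($rows as $row) {
        // latitude, longitude, weight; escape args passed explicitly
        // (the defaults are deprecated in newer PHP versions).
        fputcsv($out, $row, ',', '"', '\\');
    }
    rewind($out);
    $csv = stream_get_contents($out);
    fclose($out);
    return $csv;
}

// In the real endpoint:
// header('Content-Type: text/csv'); echo rowsToCsv($rows);
```

Because the CSV is regenerated on every request, the heatmap always reflects the current contents of the database.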
I am trying to create a world application using jQuery (JavaScript) and PHP. I originally tried doing this with a MySQL database, which didn't work well: the server got overloaded with database queries and crashed.
This time I want to store the data in a text file... maybe use JSON to parse it? How would I do this? The three main things I want to store are:
Name
x-position
y-position
The x and y positions are given from the JS. So, in order:
User loads page and picks username
User moves character, the jQuery gets the x and y position
The username, x and y position are sent to a PHP page in realtime using jQuery's $.post()
The PHP page has to find some way to store the data efficiently, without crashing the server the way the database did.
The PHP page sends back ALL online users' names and x and y coordinates to jQuery
jQuery moves the character; everyone sees the animation.
Storing the data in a file instead of the MySQL database isn't going to improve performance by itself. MySQL stores its data in files too, but it uses techniques such as caching and indexes to make access fast.
The fastest way to save and retrieve data on a server is to use RAM as storage. Redis, for example, does that: it stores all the data in RAM and can back it up to the hard drive to prevent data loss.
However, I don't think the main problem here is MySQL itself. You are probably using it in an inappropriate way, but I can't say exactly, since I don't know how many read and write requests your users generate, what the structure of your tables is, etc.
Text files are not the best performing things on Earth. Use a key-value store like Redis (it has a PHP client) to store them. It should be able to take a lot more beating than the MySQL server.
You can store the data in a text file in CSV (Comma separated values) format.
For example, consider your requirements.
1,Alice,23,35
2,Bob,44,63
3,Clan,435,322
This text file can be stored and read at any time, and you can use the explode function to separate the values.
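A sketch of reading one such line back with explode(), as suggested (the id/name/x/y layout matches the example rows above):

```php
<?php
// Parse one "id,name,x,y" line of the text file back into fields
// using explode(), matching the example rows above.
function parsePlayer(string $line): array
{
    [$id, $name, $x, $y] = explode(',', trim($line));
    return [
        'id'   => (int) $id,
        'name' => $name,
        'x'    => (int) $x,
        'y'    => (int) $y,
    ];
}

$player = parsePlayer("2,Bob,44,63\n");
// $player is ['id' => 2, 'name' => 'Bob', 'x' => 44, 'y' => 63]
```

Writing a position back out is the reverse: implode the four values with commas and append the line to the file.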