I'm using a foreach loop to import a 30 MB CSV file into MySQL. The script needs to run for about 2-5 minutes, and I'm already using ob_flush().
Now my question is:
Is there any other option to give the user an indication of the loading progress? At this point you never know when the script will have fully finished.
The best advice is not to use a foreach loop to import CSV into MySQL at all. MySQL provides a built-in feature for importing CSV files which is much quicker.
Look up the manual page for the MySQL LOAD DATA INFILE feature.
Seriously, when I say "much quicker", I mean it -- switch your code to use LOAD DATA INFILE, and you won't need a progress bar.
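For reference, a minimal sketch of what that switch might look like from PHP with PDO. The imports table, the file path, and the CSV layout are made up for illustration, and LOAD DATA LOCAL INFILE requires local_infile to be enabled on both client and server:

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass', [
    PDO::MYSQL_ATTR_LOCAL_INFILE => true,   // needed for LOAD DATA LOCAL
]);

// let the MySQL server do the bulk load instead of looping in PHP
$pdo->exec("
    LOAD DATA LOCAL INFILE '/path/to/your_30mb_file.csv'
    INTO TABLE imports
    FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
");

A 30 MB file typically loads in seconds this way, which is why the answer says a progress bar becomes unnecessary.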
I agree with @SuperJer: you must use AJAX.
Every day I import 50 MB of data, because we have an ecommerce website. We used AJAX on one side and put up a loader which shows "Files uploading...".
Foreach is good if you have little data, but for huge data I think it's not good. Always use built-in methods or functions.
To import a CSV file you may use the syntax below:
LOAD DATA INFILE 'c:/your_csv_file.csv' INTO TABLE tablename;
One of the easier ways to do this would be to call the function that does the work at the end of the page, AFTER all the page HTML has been sent to the browser.
Assuming jQuery, make note of the element in which you want the progress indicator to appear, such as id='progressBar'.
Then, as you iterate through your loop, every 100 iterations or so, echo a JavaScript line to update that element with the new percentage, and flush the output so the browser actually receives it while the script is still running:
echo "
<script type='text/javascript'>
    $('#progressBar').html('" . (int)$percentage . "');
</script>";
ob_flush();
flush();   // push the block out to the browser immediately
You'll want to make sure, depending on the number of iterations, not to overwhelm the browser with JavaScript code blocks. Perhaps check that the percentage is divisible by 5 before echoing the block.
Also, if you are looping through a CSV and INSERTing it line by line, it would be better (faster) to insert a block of rows at a time, say 500 lines per INSERT. Progress is just as easy to display using the same method; see the sketch below.
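A rough sketch of that batching idea, assuming the lines already sit in a $csvLines array with a known $total, and a hypothetical two-column rows table (adjust to your schema):

$batch = [];
foreach ($csvLines as $i => $line) {
    [$a, $b] = str_getcsv($line);                        // split the CSV line
    $batch[] = '(' . $pdo->quote($a) . ',' . $pdo->quote($b) . ')';

    if (count($batch) === 500) {                         // one INSERT per 500 rows
        $pdo->exec('INSERT INTO rows (col_a, col_b) VALUES ' . implode(',', $batch));
        $batch = [];
        $percentage = (int)(100 * $i / $total);
        echo "<script>$('#progressBar').html('" . $percentage . "');</script>";
        ob_flush();
        flush();
    }
}
if ($batch) {                                            // final, partial batch
    $pdo->exec('INSERT INTO rows (col_a, col_b) VALUES ' . implode(',', $batch));
}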
I tried to load a 16 MB file into a PHP array.
It ends up using about 63 MB of memory.
Loading it into a string consumes just the 16 MB, but the issue is that I need it inside an array so I can access it faster afterwards.
The file consists of about 750k lines (a routing table dump).
I probably should load it into a MySQL database; the issue there is that I don't have enough memory to run that thing, so I chose rqlite: https://github.com/rqlite/rqlite, since I also need the replication features.
I am not sure if an SQLite database is fast enough for that.
Does anyone have an idea for this issue?
You can get the actual file here: http://data.caida.org/datasets/routing/routeviews-prefix2as/2018/07/routeviews-rv2-20180715-1400.pfx2as.gz
The code I used:
$data = file('routeviews-rv2-20180715-1400.pfx2as');
var_dump(memory_get_usage());
Thanks.
You may use PHP's fread function. It allows reading data in chunks of a fixed size, so it can be used inside a loop to read sized blocks of data. It does not consume much memory and is suitable for reading large files.
If you want to sort the data, then you may want to use a database. You can read the data from the large file one line at a time (e.g. with fgets) and then insert it into the database.
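A minimal sketch of that loop using fgets, which keeps only one line in memory at a time. The prefixes table and the tab-separated layout of the pfx2as dump are assumptions here:

$handle = fopen('routeviews-rv2-20180715-1400.pfx2as', 'r');
$stmt = $pdo->prepare('INSERT INTO prefixes (prefix, length, asn) VALUES (?, ?, ?)');

while (($line = fgets($handle)) !== false) {
    // each line of the dump: prefix <tab> prefix length <tab> AS number
    $stmt->execute(explode("\t", rtrim($line)));
}
fclose($handle);

var_dump(memory_get_usage());   // stays flat no matter how big the file is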
I have the following problem: I upload Excel files with a form and, on submit, process them server side for 5 minutes with a background process.
Now, I want to create a snapshot of the Excel file and display it to the user, which I already do, but opening the file with PHPExcel is usually very slow, and I need to make that process faster for the sake of usability.
To be clear, if I click "preview" it may take 10, 20, or 30 seconds, or the AJAX request simply dies. Sometimes I use reduced versions of the Excel files (open them, remove 50k rows, and save again with 100 rows) for testing purposes, and then the preview is shown in no time.
What I want to do is the same thing server side with PHP. I mean, open the Excel file, remove 50k rows, save again, and then send the preview back.
Using PHPExcel doesn't help at all; it may achieve what I want, but again, the time is not acceptable.
Is there any way I can do something like:
$excel_info = file_get_contents($file);
//USE SOME REGEX OR RULE TO REMOVE COLUMNS, OR OTHERWISE, EXTRACT ONLY SOME ROWS
$first10ColumnsInfo = customFunction($excel_info);
file_put_contents("tmp/reduced_excel.xlsx", $first10ColumnsInfo);
I tried to look into the PHPExcel library to get an idea of how it handles the data, and to do something similar myself, but at some point I simply got lost; I could retrieve some info, but not properly formatted.
Thank you in advance.
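For what it's worth, a minimal sketch of that idea: an .xlsx file is just a zip archive, so the first worksheet's XML can be rewritten directly, keeping only the first N rows. This assumes the default layout (xl/worksheets/sheet1.xml) and ignores details such as the sheet's dimension reference, so treat it as a starting point rather than a finished customFunction:

// work on a copy so the original upload stays intact
copy($file, 'tmp/reduced_excel.xlsx');

$zip = new ZipArchive();
$zip->open('tmp/reduced_excel.xlsx');

// load the first worksheet's raw XML and drop every row past the first 100
$dom = new DOMDocument();
$dom->loadXML($zip->getFromName('xl/worksheets/sheet1.xml'));
$rows = $dom->getElementsByTagName('row');   // live list: it shrinks as we remove
while ($rows->length > 100) {
    $last = $rows->item($rows->length - 1);
    $last->parentNode->removeChild($last);
}

// write the truncated sheet back into the archive
$zip->addFromString('xl/worksheets/sheet1.xml', $dom->saveXML());
$zip->close();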
I am looking for the best way to export a CSV file, with MySQL and PHP.
Currently I'm generating a CSV with INTO OUTFILE. It works that way, but I don't think it's the right way.
Isn't there a better option for a CSV export download button that works every time a user clicks it?
An INTO OUTFILE export is only possible once per filename and cannot be overwritten.
I have to generate a timestamp, save the file, and then get the latest file from my directory.
This method seems a bit messy for downloading a CSV file from a server...
Has anyone got better solutions?
Thanks!
I think you are well off exporting via INTO OUTFILE. The reason is that writing the content to the CSV file is done by the MySQL server itself. Doing it with a PHP script would be slower (first of all because it is a script, and second because the data from the SQL server needs to be passed to the script) and would cost you more resources.
If the CSV file(s) become large, you should keep in mind that your script may still time out. You can counter this issue either by setting a higher value for the maximum running time of a script in the configuration, or by having the CSV file created by an external process/script.
Maybe something like this:
`echo $query | mysql > $unique`;
$contents = file($unique);
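Fleshing that sketch out a bit; this assumes the mysql command-line client is installed and can authenticate (e.g. via an option file), and note that the client emits tab-separated output rather than strict CSV:

// stream the query result to a unique temp file via the mysql client
$unique = tempnam(sys_get_temp_dir(), 'export_');
$query  = 'SELECT * FROM customers';
shell_exec('mysql mydb -e ' . escapeshellarg($query) . ' > ' . escapeshellarg($unique));

// then hand the file to the browser as a download
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="export.csv"');
readfile($unique);
unlink($unique);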
I have a 100,000-row txt file and I need to read it in order to insert most of it into my DB.
I'd like to use this plugin, as I found it very easy to use:
http://www.bram.us/projects/js_bramus/jsprogressbarhandler/#download
My problem is: I read the txt file with PHP, but I don't understand how to update the progress bar!
I was thinking of something like this:
echo '$("#progressbar").progressbar({ value: '.($k++).' });';
where $k goes from 0 to 100, but WHERE do I have to put it??
I've found this method:
http://spidgorny.blogspot.it/2012/02/progress-bar-for-lengthy-php-process.html
I think this could help, with some editing of the code.
You can't mix the PHP and the JavaScript like that:
The PHP will run and generate an HTML/JS file
The HTML/JS file will be sent to the client
The client will run the JS: $("#progressbar").progressbar({ value: XX });
So $k will be static.
--
If you really want to do something like that easily, you could use an intermediary DB table with three columns: txt_file, position, length,
and update this table frequently while the PHP script is running over the txt file.
Client side, in JavaScript, you can make an AJAX request (using jQuery, for example) every 5 or 10 seconds, which calls another PHP page, and this PHP page will only return the corresponding row from the intermediary table. Once you have the result, you can update the progress bar.
--
It's the simplest solution for you to implement, but it's still really dirty, and the parsing of the txt file had better be really long!
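For illustration, a minimal sketch of that second PHP page (the polling endpoint), assuming the three columns above live in a hypothetical import_progress table; the client can hit it every few seconds with $.getJSON and feed the value into the progressbar call:

// progress.php -- returns one file's progress as a 0-100 value
$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare('SELECT position, length FROM import_progress WHERE txt_file = ?');
$stmt->execute([$_GET['file']]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo json_encode(['value' => $row ? (int)(100 * $row['position'] / $row['length']) : 0]);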
There is no direct method to achieve this. The PHP script executes first and only then is the output sent to the client viewing the web page; that is why you cannot show the live status of your PHP script's processing to the client.
You will have to use a combination of AJAX and a database:
Create a table to track the progress of the loading of the text file. Whenever a user (client) sends a request to the page, keep updating the table with the progress. Use the session id as the index on the table so that it is easy to track progress per client. Now use AJAX requests to fetch the progress from the table and present it to the client with your progress bar.
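The writer half of that could look roughly like this (a sketch; the progress table, its columns, and the $lines array are made up for illustration):

// importer side: record progress under the client's session id
session_start();
$update = $pdo->prepare('REPLACE INTO progress (session_id, percent) VALUES (?, ?)');

foreach ($lines as $i => $line) {
    // ... insert $line into the database ...
    if ($i % 1000 === 0) {
        $update->execute([session_id(), (int)(100 * $i / count($lines))]);
    }
}
$update->execute([session_id(), 100]);   // mark completion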
Let me first describe what I've made:
I have to import a large amount of data from different XMLs into my database, and because it takes a long time I had to put up a progress bar. I did it like this: I split the whole import into tiny little AJAX requests and import a little data at a time (when an AJAX request completes, the progress bar increases a bit). The whole idea is great, but the data just keeps getting bigger and bigger and I can't optimize the code any further (it's as optimized as it gets).
The problem is that every time I make an AJAX call I lose a lot of time on things specific to the framework (model initialization and such), on the browser handling the URL, and so on. So I was wondering if I could use PHP's flush function.
But I've been reading that the flush function doesn't work well on all browsers (which is weird, because it's a server-side function). If I could use flush, I would just write <script>increase_progressbar</script> or whatever I wanted.
So, any opinions on the flush function? I've been testing it on little scripts, but I want to know if someone has really used it with big scripts. Also, I'm open to any other suggestion for doing what I want to do :)
I won't give you direct advice, but I will tell you how I did it in one of my projects. In my case I needed to upload Excel files and then parse them. The data exceeded 3,000 rows and I had to check all the columns of each row for certain data. When I parsed it directly after the upload, the parser often crashed somewhere, and it was really not safe.
So how did I do it? The upload process was split into 2 parts:
Upload the file physically (regular upload field and submit). When the button is clicked, some CSS and JS "magic" hides the form and a nice loading bar appears on the screen. When the upload is done, the page just refreshes and the form appears again for the next file.
Start parsing the data in the background using php-cli, as @Dragon suggested, with exec(); see the sketch below.
In the database I had a table which stores information about the files, with a boolean field called "parsed". When the parser finishes the job, its last task is to update that field to true.
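Roughly, the kick-off looked like this (a sketch with made-up paths and table names):

// fire the CLI parser and return immediately; redirecting output and
// backgrounding with & stops exec() from blocking the request
exec('php /var/www/cli/parse_excel.php ' . escapeshellarg($fileId) . ' > /dev/null 2>&1 &');

// ...and at the very end of parse_excel.php, flag the file as done:
$pdo->prepare('UPDATE files SET parsed = 1 WHERE id = ?')->execute([$fileId]);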
So here is the whole process from the user's point of view:
Select a file and upload it.
Wait until the file has been uploaded to the server. Until then, a message and loading bar indicate that something is working. The upload form is hidden with CSS and JS, preventing the user from uploading another file.
When it's over, the page refreshes (because I did a normal POST submit) and the form appears on the screen again, along with a list of recently uploaded files (I stored this in the session).
Each node of that list has an indicator (an icon). At first it's a spinner (an AJAX spinning wheel).
On a regular basis (every 30 sec or 1 min) I checked the files table through an AJAX call, reading the parsed field. If the background process was over, the field was set to true, and with some JS and CSS I changed the icon to "Done". Otherwise the spinner remained.
In my project I didn't have a requirement to show extra details about the imports, but you can always go wild with extra data.
Hope this helps you with your project.