For example, I have 200 records in a table. My requirement is to fetch those 200 records, generate CSV files, and then send them to a specified FTP folder. This is just an example, but in general, which of the following is the better approach?
Method one: fetch all the ids into an array, count it, and call the function once per id in a for loop.
$ids_array = array(/* all the ids from the related table */);
$array_cnt = count($ids_array);
for ($i = 0; $i < $array_cnt; $i++) {
    generateCsv($ids_array[$i]);
}
Here the generateCsv function is called 200 times and generates 200 CSV files.
Method two: fetch all the ids from the table as a comma-separated string and pass them to the function, like below:
function generateCsv($ids)
{
$qry = "SELECT soem fields from the table where ids IN ('".$ids."')";
// SOME CSV generation code will comes here.
}
So of the above two methods, which one is better?
The best way is not to loop at all. MySQL has excellent functionality built in for generating CSV files: SELECT ... INTO OUTFILE.
SELECT ... INTO OUTFILE writes the selected rows to a file. Column and line terminators can be specified to produce a specific output format.
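For example, a minimal sketch, assuming $connect is an open mysqli connection; the output path, table, and column names are placeholders, the file is written on the MySQL server host, and the MySQL user needs the FILE privilege:
$sql = <<<'SQL'
SELECT id, some_field, another_field
INTO OUTFILE '/tmp/records.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM the_table
SQL;
// the statement runs entirely inside MySQL; no PHP loop is needed
mysqli_query($connect, $sql);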
Once you have generated the file, consider using SFTP or some other secure protocol to transfer it, rather than FTP, which sends passwords in clear text.
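A rough upload sketch, assuming the PECL ssh2 extension is installed (hostname, credentials, and paths are placeholders; phpseclib's SFTP class is a common alternative):
$session = ssh2_connect('sftp.example.com', 22);
ssh2_auth_password($session, 'username', 'password');
// copy the generated file to the remote folder over SSH
ssh2_scp_send($session, '/tmp/records.csv', '/remote/folder/records.csv', 0644);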
Basically, I have coded a PHP script to pull information from a JSON file and store it all in one column inside my MySQL database. Here is the format in which I am storing the data:
(37.50!03:37:42pm PST)
So I basically have multiple entries of similar results stored inside brackets, all within one column.
Now I want to limit the results displayed when I pull that information back from the database and show it on my webpage, and I can't figure out how. Is there a simple way?
I have tried using LIMIT in my SQL statement, but to my understanding (maybe I am wrong) that limits the number of rows returned; it does not limit the entries within a single column.
Thank you for your time.
Honestly, it might be easier to accomplish this using PHP. You haven't posted the details of what multiple entries look like, but I am guessing it is something like this: '(37.50!03:37:42pm PST),(37.50!03:37:42pm PST),(37.50!03:37:42pm PST)'.
When fetching the data, you split it and turn it into an array; with preg_split you can have it return only a limited number of pieces (the example uses 50). Post a comment if this doesn't work or if you can clarify the format of multiple entries in the field.
Example:
$rs = mysqli_query($conn, "SELECT yourColumn FROM your_table ORDER BY your_sort_column DESC");
while ($row=mysqli_fetch_assoc($rs)) {
$parts = preg_split('~,~', $row['yourColumn'], 50, PREG_SPLIT_NO_EMPTY);
foreach ($parts as $part ) {
echo "$part<br>";
}
}
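One caveat with preg_split's limit parameter: the last element of the returned array contains the entire unsplit remainder of the string, so with a limit of 50 the 50th entry would hold everything left over. If you only want the first few entries, a sketch like this avoids that (the 10 is just an example):
$parts = array_slice(explode(',', $row['yourColumn']), 0, 10);
foreach ($parts as $part) {
    echo $part . '<br>';
}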
I'm using CakePHP 2.1.3. I have a performance problem with looping over the large array of data returned from find('all'). I want to retrieve the query result row by row to eliminate this expensive array processing; I don't want the array result set returned by find() or query(). What I'm trying to do is like below:
$db = $this->Model->getDataSource();
$sql = 'SELECT * FROM my_table';
if ($result = $db->execute($sql)) {
$db->resultSet($result);
while($row = $db->fetchResult()){
// do something with $row
}
}
However, I don't want to write the raw query. Is there any Cake function that just builds the query according to the association set and executes it without returning the result set?
[Edit]
I'm currently implementing the above script in a controller. My model has no associations, so I don't need to use recursive = -1. I am fetching the whole table for the purpose of CSV export.
Cake's find() does its own internal array processing, and the returned result set then has to be looped over again explicitly. I want to optimize the code by avoiding processing the large array twice.
Related issue: https://github.com/cakephp/cakephp/issues/6426
First, be sure that you only fetch the data you really need. Ideally you get everything you need with $this->YourModel->recursive = -1.
Performance problems often arise from fetching a lot of associated data.
Once you have checked that, I think the best solution is a loop that fetches the desired data in chunks, for example via a BETWEEN condition on the id or a limit/offset, roughly like the sketch below. Although I am not sure if this will help you.
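A rough chunked-fetch sketch for a CakePHP 2 controller action (the model name and chunk size are placeholders). Each find() still builds arrays, but only for one chunk at a time, so memory stays bounded:
$this->MyModel->recursive = -1;
$offset = 0;
$chunk  = 1000;
do {
    $rows = $this->MyModel->find('all', array(
        'limit'  => $chunk,
        'offset' => $offset,
    ));
    foreach ($rows as $row) {
        // write this chunk of rows to the CSV here
    }
    $offset += $chunk;
} while (count($rows) === $chunk);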
Why do you want to go through the whole table? Are you performing some maintenance, like filling a new field or updating a counter? Maybe you can achieve the goal in a better way than by fetching a whole table.
I am making a website for a car show. I want to store images in the database (just the URL), and what I want to do is add all of the image URLs for a car to the same cell in the table.
Then, at retrieval time, I want to use PHP's explode() so I can separate each URL and use a loop to display them.
The problem I am facing is that I do not know what to use as a delimiter. I cannot use anything that could appear in a file name on Windows, Mac, or Linux, and I am afraid of using a system-reserved character and causing a problem.
I am also concerned about the data type that will hold this information; I am thinking TEXT is best here, but I have heard many say it causes problems.
To be clear, the idea is:
When someone uploads 3 images, the images will be uploaded into a folder, then the names will be taken and put into one string (after the directory names are added) with a separator between them, which will then be stored in the database.
Then I take that string, use explode() to store the separated data in an array, and use a loop to display an image with the source being the stored data in the array.
I need a special delimiter or another way. Can someone help me do this, or tell me another way of saving the images without a potential risk? I have seen many websites which use dynamic bullet points (lists), but I was never able to get a code example or even an idea of how to do them.
EDIT:
The current way I am using is having 10 rows, one for each image. The problem here is that the user will not be able to add more than 10 images, and if they have fewer than 10 images then a few empty images will be displayed. (I know a solution for the empty images, but it is impossible to add more images.)
You can use any kind of serialization (serialize, json_encode) when storing your array, and the matching deserialization (unserialize, json_decode) when you want to use it.
But I advise you to create a new table for your images, with a car_id field, for example.
Then you can just join it and get everything.
It could be CarImages ('id', 'car_id', 'image_file').
I also recommend adding a foreign key constraint from CarImages.car_id to Cars.id with ON DELETE CASCADE, so that orphaned image rows are removed automatically when their car is removed.
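A possible sketch of that table (assuming $connect is an open mysqli connection and the existing cars table is InnoDB with a matching integer id; all names are illustrative):
mysqli_query($connect, "
    CREATE TABLE car_images (
        id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        car_id INT UNSIGNED NOT NULL,
        image_file VARCHAR(255) NOT NULL,
        FOREIGN KEY (car_id) REFERENCES cars (id) ON DELETE CASCADE
    ) ENGINE=InnoDB
");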
Storing serialized values is always a bad idea.
If, for some technical reason, you cannot store one row per image in a separate table, then you should json_encode the array of image paths and store the result in the database.
Solution one :
Create a table called images containing 3 columns (id, image_path, user_id). Every time the user uploads an image, insert it into this table; then, in your script, if you want to display the uploads for a specific user, fetch them by user_id:
$user_id = 1; // the specified user id
mysqli_query($connect, "SELECT * FROM images WHERE user_id = '$user_id'");
// And Loop here trough images
Solution Two :
Inserting image paths into one column.
The table files contains 1 column called path.
Inserting the values into the files table:
$array = array(
    '/path/to/file/1',
    '/path/to/file/2',
    '/path/to/file/3'
);
// join all paths with the delimiter and store them in a single row/column
$paths = implode('|', $array);
mysqli_query($connect, "INSERT INTO files (path) VALUES ('$paths')");
Now display the results :
$query = mysqli_query($connect, "SELECT path FROM files");
$row = mysqli_fetch_assoc($query);
$paths = explode('|', $row['path']);
foreach($paths as $path) {
echo $path . '<br>';
}
If you do not want to change your database, then you should try serializing the data. You can use either json_encode/json_decode or serialize/unserialize; either one works.
If you design your tables like this:
Table-user
user_id
username
another table for user images
Table-images
serial_num
user_id
image_url
then you can store many images for one user. Here, user_id in the images table is effectively a foreign key referencing the user table's user_id.
You are using a relational database, so this fits well; otherwise you could use a NoSQL database.
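A sketch of fetching one user's images through that foreign key (assuming $connect is an open mysqli connection and the table/column names above; in real code the id should be escaped or bound):
$user_id = 1;
$result = mysqli_query($connect,
    "SELECT u.username, i.image_url
     FROM user u
     JOIN images i ON i.user_id = u.user_id
     WHERE u.user_id = $user_id");
while ($row = mysqli_fetch_assoc($result)) {
    echo '<img src="' . $row['image_url'] . '">';
}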
I'm working with importing CSV files into a database, and it is a multi-step process.
Here are the steps:
Step 1: User uploads a CSV file.
Step 2: User associates the data to a data type. For example, if a record in the CSV contains the data John,Doe,johndoe@gmail.com, the user would select firstname from a dropdown box to associate with the data value John, lastname from a dropdown box to associate with the data value Doe, and emailaddress from a dropdown box to associate with the data value johndoe@gmail.com.
Step 3: Insert data into database
My questions are the following:
1./ On step 3, I would have in my possession the columns which the user chose and the original data.
Here is what the original data looks like:
$data = array(
0 => array('John','Doe','johndoe@gmail.com'),
1 => array('Foo','Bar','foobar@gmail.com')
);
And here is what my columns chosen from step 2 looks like:
$columns = array('firstname','lastname','emailaddress')
How do I create a sql query that can be like the following:
INSERT INTO contacts (id,firstname,lastname,emailaddress) VALUES (null,'John','Doe','johndoe@gmail.com')
As you can see, the sql query has the columns chosen in the order that they are within the array and then subsequently the values. I was thinking that since the columns are chosen in the order of the data, I can just assume that the data is in the correct order and is associated to the specific column at that position (for example, I can assume that the data value 'John' was associated to the first position of the columns array, and vice versa).
2./ I was thinking of a possible scenario: when the user does the initial upload of the file, they could potentially send a CSV file in which the first record has a missing field. The problem is that I determine how many columns the user must associate with the data based on the number of columns in a single CSV record. In that case, the first record has 2 columns while every subsequent record has 3. I'm not going to loop through the entire set of records just to determine the correct number of columns. How do I resolve this issue? Any ideas?
EDIT
I think I figured out the answer to question 2. While parsing the CSV file, I can count the fields in each record, and the highest count at the end of the parsing is my column count. Does that seem right? Any issues with that?
To parse the data from the CSV file, look at fgetcsv. http://php.net/manual/en/function.fgetcsv.php
It'll load a line from the file and return an array of the CSV fields.
$data = array();
while (($lineFields = fgetcsv($handle)) !== false) {
$data[] = $lineFields;
}
This assumes you are using PHP5 and opened the file with $handle. In PHP4 fgetcsv needs a second parameter for max length of line to read.
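Regarding the edit about question 2, a small variation of that loop can track the widest record while parsing (a sketch, with $handle opened as above):
$data = array();
$maxColumns = 0;
while (($lineFields = fgetcsv($handle)) !== false) {
    // remember the largest field count seen so far
    $maxColumns = max($maxColumns, count($lineFields));
    $data[] = $lineFields;
}
// $maxColumns is the number of dropdowns to show in step 2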
For the query:
$sql = "INSERT into contacts (id," + implode(',', $columns) + ") values";
I'm not including anything after the values. You should be creating prepared statements to protect against sql injections. Also if you are using MySQL, id should be an autoincrement field and omitted from inserts (let MySQL generate it). If you are using Postgres, you'll need to create a sequence for the id field. In any case let the database generate the id for you.
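A rough sketch of that insert with prepared statements, assuming $pdo is an open PDO connection, id is an auto-increment column left out of the insert, and $columns has already been validated against a whitelist (column names cannot be bound as parameters):
$placeholders = implode(',', array_fill(0, count($columns), '?'));
$sql  = 'INSERT INTO contacts (' . implode(',', $columns) . ') VALUES (' . $placeholders . ')';
$stmt = $pdo->prepare($sql);
foreach ($data as $row) {
    // $row is one parsed CSV record, in the same order as $columns
    $stmt->execute($row);
}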
I built a document upload admin screen where my client can browse and upload PDF documents to a MySQL database.
I have two separate tables: one for the Agendas and one for the Minutes, with the table names "upload" and "upload_mins".
On the index.php page, I have the page fetch each row from the database and display all of the valid documents on a "download" page.
I have come across a problem.
Each table is set to auto-increment its ID. Now that there are two tables, they end up using the same IDs.
so I am pulling from the Agenda table:
http://www.example.com/clerk.php?ID=77
and I am pulling from the Minutes table also:
http://www.example.com/clerk.php?ID=77
and they happen to have the same increment ID.
Is there some way to avoid this? Can I add a field parameter to the minutes to make sure that they don't have the same URL when pulling documents?
Should I create an integer field, or a text field?
i.e. http://www.example.com/clerk.php?ID=77&min=yes
If these are just documents, you could store them in a single table but have a column called type that differentiates between minutes and agendas. That way, IDs will be unique.
You could also make the type column a foreign key to a types table, so that you can easily extend it to include additional document types, such as spreadsheets, in the future.
This would also aid the front-end display, as you would only need one query to fetch the documents within a batch or timeframe, but add them to differing arrays. For example:
// get the last 20 uploaded documents
$sql = "SELECT * FROM documents ORDER BY created DESC LIMIT 0,20";
$res = mysql_query($sql);
while ($doc = mysql_fetch_object($res)) {
$type = $doc->type;
if (!isset($docs[$type]) || !is_array($docs[$type])) {
$docs[$type] = array();
}
$docs[$type][] = $doc;
}
// loop over agendas
foreach ($docs['agendas'] as $agenda) {
echo '<a href="clerk.php?ID=' . $agenda->id . '">' . $agenda->title . '</a>'; // link target is an example
}
// loop over minutes
foreach ($docs['minutes'] as $minutes) {
echo '<a href="clerk.php?ID=' . $minutes->id . '">' . $minutes->title . '</a>'; // link target is an example
}
...
You say that the problem you are having is with URLs being duplicated. You have 2 different options to solve that.
The first is to create an identifier that tells you which table you want. You could have agenda.php and minutes.php, or you could have clerk.php?ID=77&type=1 and clerk.php?ID=77&type=2.
This will allow you to know which table you are referencing. This is also the approach I would recommend. There is no reason to avoid duplicate keys in different tables as long as you have a way of knowing which table you need.
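A hypothetical clerk.php sketch of that first option, using the existing upload and upload_mins tables (the type values are just examples, and the old mysql_* functions mirror the code above):
$id    = (int) $_GET['ID'];
// whitelist the table name based on the type parameter
$table = (isset($_GET['type']) && $_GET['type'] == '2') ? 'upload_mins' : 'upload';
$res   = mysql_query("SELECT * FROM $table WHERE id = $id");
$doc   = mysql_fetch_object($res);
// ... output the stored PDF from $doc here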
The second option is to put all your records into a single table. You can then add a type column that specifies what type of document it is. You can then leave your clerk.php file alone, but need to change the upload page to populate the type column.
If you are just storing PDFs in the database, you shouldn't need anything in this table except id, type, and the document itself.
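One possible shape for that combined table, as a sketch; the column names and types are illustrative, with title and created taken from the snippet earlier in this answer:
mysql_query("
    CREATE TABLE documents (
        id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        type VARCHAR(20) NOT NULL,
        title VARCHAR(255) NOT NULL,
        document MEDIUMBLOB NOT NULL,
        created DATETIME NOT NULL
    )
");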