Redirect post without database - php

I want to get the content of a textbox posted to another page, without a database. I am creating a complaint box, and each complaint should be posted to an admin page. Every new complaint should appear at the top, with the older ones below it. Can you please help?

When you store data, you are effectively creating a database anyway.
If you don't want to use SQL or something similar (SQLite, an XML file DB, ...),
you can use a directory structure like this:
/[year]
/[year]/[month]
/[year]/[month]/[day]
/[year]/[month]/[day]/[exact_timestamp]+[random_hash].[file_extension]
and save each complaint as a standalone text file in that structure, for example as sketched below.
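A minimal sketch of that idea, assuming PHP 7+, a writable "complaints" directory, and a form field named "complaint" (all of these names are illustrative only):
$base = __DIR__ . '/complaints';
$dir  = $base . '/' . date('Y') . '/' . date('m') . '/' . date('d');
if (!is_dir($dir)) {
    mkdir($dir, 0775, true); // create the year/month/day folders as needed
}
// exact timestamp plus a random hash keeps the file names unique and sortable
$file = $dir . '/' . time() . '_' . bin2hex(random_bytes(8)) . '.txt';
file_put_contents($file, $_POST['complaint'] ?? '');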

It may be best to just have one delimited file. Then you only have to read from one place. Depending on the frequency of your complaints, you may want to have one file per day.
To avoid strange locking issues, maybe do something similar to what Marek suggested and temporarily store each complaint in a standalone file, then run a cron job to concatenate them every so often, as in the sketch below.
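A rough sketch of that cron step; the "incoming" directory and "complaints.log" file names are assumptions for illustration:
$incoming = glob(__DIR__ . '/incoming/*.txt');
sort($incoming); // timestamped file names sort chronologically
$log = fopen(__DIR__ . '/complaints.log', 'a');
flock($log, LOCK_EX);
foreach ($incoming as $path) {
    // one line per complaint, tab-delimited: timestamp <TAB> text
    $text = str_replace(["\t", "\n"], ' ', file_get_contents($path));
    fwrite($log, filemtime($path) . "\t" . $text . "\n");
    unlink($path); // remove the standalone file once it has been merged
}
flock($log, LOCK_UN);
fclose($log);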

Related

How Do I Make A New Page Every So Many Inputs In PHP and SQL?

I am trying to make it so users can submit text files with a short message next to each one, and they will be visible to everyone. I am using PHP and SQL (MariaDB/MySQL for the database), and I want it so that every ten inputs, for example, a new page is created. So if there are ten files on the page and someone submits another one, it should automatically create a page two and put the oldest submitted file there. How can I do this? I have seen other sites use a GET method with something like ?page=2. How can I do this myself?
What you're looking for is "Pagination".
Pagination using MySQL LIMIT, OFFSET
That can look a bit scary at first, but if you use a framework such as Laravel it can easily be done; see:
https://laravel.com/docs/8.x/pagination#introduction
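If you want to stay with plain PHP, here is a minimal sketch of LIMIT/OFFSET pagination with PDO; the table name "uploads" and the connection details are assumptions for illustration:
$perPage = 10;
$page    = max(1, (int)($_GET['page'] ?? 1)); // ?page=2, ?page=3, ...
$offset  = ($page - 1) * $perPage;
$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare('SELECT * FROM uploads ORDER BY id DESC LIMIT :limit OFFSET :offset');
$stmt->bindValue(':limit',  $perPage, PDO::PARAM_INT); // bind as integers so LIMIT is not quoted
$stmt->bindValue(':offset', $offset,  PDO::PARAM_INT);
$stmt->execute();
foreach ($stmt as $row) {
    // render each upload and its message here
}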

How to parse CSV file in PHP and store fields in a database?

I need help parsing the following a CSV file in PHP, so I can insert the contents into a database.
I know I use file_get_contents() but after that I feel a bit lost.
What I'd like to store:
Collection1 - events.text & date
Collection2 - position & name.text & total
I'm not sure how best to structure the data to insert into a database table.
"**collection1**"
"events.href","**events.text**","**date**","index","url"
"tur.com/events/classic.html","John Deere Classic","Thursday Jul 9
- Sunday Jul 12, 2015","1","tur.com/r.html"
"collection2"
"**position**","name.href","**name.text**","**total**","index","url"
"--","javascript:void(0);","Scott","--","2","tur.com/r.html"
"--","javascript:void(0);","Billy","--","3","tur.com/r.html"
"--","javascript:void(0);","Jon","--","4","tur.com/r.html"
"--","javascript:void(0);","Bill","--","5","tur.com/r.html"
"--","javascript:void(0);","Tim","--","6","tur.com/r.html"
"--","javascript:void(0);","Carlos","--","7","tur.com/r.html"
"--","javascript:void(0);","Robert","--","8","tur.com/r.html"
"--","javascript:void(0);","Rod","--","9","tur.com/r.html"
As per your previous question, I think this needs to be broken down into sections. As it stands it is rather too broad to answer.
Read the information using file_get_contents(). Make sure this works first, by echoing it to the console. (It sounded from your other question that you felt this would not work if the URL does not have a .csv suffix. It should work regardless of the file extension - try it. If it fails it may be dependent on cookies or JavaScript or some other problem).
Design and create your table structure in MySQL. It seems like you have two tables. They should both have a primary key. Are they related in some fashion? If so, perhaps one has a foreign key to the other one?
Explode your text file on the new line character and loop across the resulting array of lines.
If your CSV data has a title row in the first row position, delete that from your array.
For each line, read the elements of interest using PHP's built-in CSV parsing functions, and store them in variables (see the sketch at the end of this answer).
Pass these variables to a custom function that saves the data.
For each save, you'll need to do an INSERT. I recommend using PDO here. Make sure you bind your parameters.
Where you get stuck on a specific problem, you can ask a new and focussed question. At present, the task is to break things down into discrete and researchable pieces.
One trick worth remembering is this shortcut to the PHP manual. If you do not know how fgetcsv works, for example, type php.net/fgetcsv into your browser address bar, and the PHP site will find the function for you. The documentation is excellent.
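A rough, hedged sketch of those steps, assuming a file that contains only the "collection2" rows plus a single header row; the URL, table name "results" and connection details are purely illustrative:
$raw   = file_get_contents('https://example.com/export.csv'); // works for URLs if allow_url_fopen is enabled
$lines = array_filter(explode("\n", $raw));                   // split on newlines, drop empty lines
$pdo  = new PDO('mysql:host=localhost;dbname=golf', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO results (position, name, total) VALUES (:position, :name, :total)');
foreach ($lines as $i => $line) {
    if ($i === 0) {
        continue; // skip the title row
    }
    $fields = str_getcsv($line);            // built-in CSV parsing
    [$position, , $name, $total] = $fields; // keep only position, name.text and total
    $stmt->execute([                        // bound INSERT via PDO
        ':position' => $position,
        ':name'     => $name,
        ':total'    => $total,
    ]);
}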

fopen and write in file

I have some questions about fopen.
The first question: when I add a new entry, it always goes at the end of the file and not at the start. For example:
$fp=fopen("text.dat","a");
fputs($fp,"Hello 1"."\n");
fclose($fp);
The results in this file always appear appended at the end:
Hello 1
Hello 2
Hello 3
And not, as I want, with the new comment inserted in first place, showing as:
Hello 3
Hello 2
Hello 1 (the oldest entry)
My second question: if I have 10 users and these 10 users insert an entry or post into this text file at the same time, is that possible or can it give me errors? Or do I need to use flock while saving each post? What is the best method to avoid problems when several users want to change something in the file at the same time?
There's no way to prepend to the file automatically, so it is better to read the existing contents first and then write them back after the new entry.
$existing = file_exists("text.dat") ? file_get_contents("text.dat") : "";
$fp = fopen("text.dat", "w");
fwrite($fp, "Hello 1" . "\n" . $existing); // new entry first, old contents after it
fclose($fp);
Doing this for each new entry, the file will read:
Hello 3
Hello 2
Hello 1
But as far as locking is concerned, I don't think it is possible, or at least I am not the right person to answer that part.
When you write to a file it will always append at the end. There's no way around that which I'm aware of, but in order to achieve what you want (which is to display the lines in reverse order) you can read the lines into an array and display the array in reverse order, as sketched below.
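A small sketch of that reverse-order display: keep appending normally, then read the lines into an array and print them newest first.
$lines = file("text.dat", FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach (array_reverse($lines) as $line) {
    echo htmlspecialchars($line) . "<br>\n"; // newest entries come out first
}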
As for locking, only one process can hold the lock on a file, so you don't really have to do anything, because if two users try to update the same file at the same time only one of them will succeed. That actually creates a different problem: one of the users will lose her post. To work around it, you should send to the backend both the original copy of the post and the new version submitted by the user; before you save the user's edit, check that the original version is still the current one. If it's not up to date, it means that another user changed it in the meantime. The "user-friendly" behavior would be to return an error to the user saying that his version is not up to date, but also include his edits, so that he won't have to re-write everything from scratch.
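If you do decide to lock the file explicitly, a minimal sketch with flock() might look like this ($post is an assumed variable holding the submitted text):
$fp = fopen("text.dat", "a");
if (flock($fp, LOCK_EX)) {      // exclusive lock; blocks until other writers are done
    fwrite($fp, $post . "\n");  // $post is assumed to hold the user's text
    fflush($fp);                // make sure the data hits the file before unlocking
    flock($fp, LOCK_UN);        // release the lock
}
fclose($fp);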
For that you really want a database, which is better suited to multi-user access and sorting.
Or else use a subdirectory and create every message in its own file, with a file name made up of a sortable timestamp: yyyymmddhhmmss. But then you need to prevent directory caching.
As everyone has the right to be stubborn and cut a corner: see file_get_contents() to load all the contents, and file_put_contents() to write them back.
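A quick sketch of that "stubborn" approach, prepending the new entry by rewriting the whole file:
$old = file_exists("text.dat") ? file_get_contents("text.dat") : "";
file_put_contents("text.dat", "Hello 4" . "\n" . $old, LOCK_EX); // new entry first, old contents after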

Using PHP to replace a line in a flat-file database

There are quite a few different threads about this similar topic, yet I have not been able to fully comprehend a solution to my problem.
What I'd like to do is quite simple, I have a flat-file db, with data stored like this -
$username:$worldLocation:$resources
The issue is that I would like to have an HTML page for submitting data that updates this line, based on a search for the term, using PHP:
search db for - $worldLocation
if $worldLocation found
replace entire line with $username:$worldLocation:$updatedResources
I know there should be a fairly easy way to get this done, but I am unable to figure it out at the moment. I will keep trying while this post is up, but if you know a way I could use, I would greatly appreciate the help.
Thank you
I have always loved C, and the functions that came into PHP from C.
Check out fscanf and fprintf.
These will make your life easier when reading and writing in a fixed format. For example:
$filehandle = fopen("file.txt", "r"); // open for reading ("c" is write-only)
while ($values = fscanf($filehandle, "%s\t%s\t%s\n")) {
    list($a, $b, $c) = $values;
    // do something with $a, $b, $c
}
fclose($filehandle);
Also, there is no performance workaround that avoids reading the entire file into memory, changing one line, and writing the entire file back. You have to do it.
This is about as efficient as you can get, because you are most probably running native C code; from what I have read, PHP just wraps C's functions in these cases.
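For the fprintf side, a hedged companion sketch for writing records back out in the question's colon-delimited format ($records is an assumed array of parsed rows):
$out = fopen("db.txt", "w");
foreach ($records as $r) {   // $records is an assumed array of parsed rows
    fprintf($out, "%s:%s:%s\n", $r['username'], $r['worldLocation'], $r['resources']);
}
fclose($out);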
You like the hard way, so be it...
Make each line the same length. Add spaces, tabs, a capital X, etc. to fill in the blanks.
When you want to replace the line, find it, and since each line has a fixed length you can overwrite it in place (see the sketch below).
For speed and less hassle, use a database (even SQLite).
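A hedged sketch of that fixed-length idea, assuming every record is padded to exactly 64 bytes (newline included) and that $username, $worldLocation and $updatedResources come from the submitted form:
const RECORD_LEN = 64;                      // assumed record size, newline included
$fp = fopen("db.txt", "r+");
$lineNo = 0;
while (($line = fgets($fp, RECORD_LEN + 1)) !== false) {
    $parts = explode(":", rtrim($line));
    if ($parts[1] === $worldLocation) {     // found the record to replace
        $new = str_pad("$username:$worldLocation:$updatedResources", RECORD_LEN - 1) . "\n";
        fseek($fp, $lineNo * RECORD_LEN);   // jump back to the start of that line
        fwrite($fp, $new);                  // overwrite it in place
        break;
    }
    $lineNo++;
}
fclose($fp);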
If you're committed to the flat file, the simplest thing is iterating through each line, writing a new file & changing the one that matches.
Yeah, it sucks.
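A sketch of that iterate-and-rewrite approach, again assuming $username, $worldLocation and $updatedResources hold the submitted values:
$lines = file("db.txt", FILE_IGNORE_NEW_LINES);
$out   = fopen("db.txt.tmp", "w");
foreach ($lines as $line) {
    $parts = explode(":", $line);
    if ($parts[1] === $worldLocation) {                 // the line that matches
        $line = "$username:$worldLocation:$updatedResources";
    }
    fwrite($out, $line . "\n");
}
fclose($out);
rename("db.txt.tmp", "db.txt");                         // swap the new file into place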
I'd strongly recommend switching over to a 'proper' database. If you're concerned about resources or the complexity of running a server, you can look into SQLite or Berkeley DB. Both of these use a database that is 'just a file', removing the issue of installing and maintaining a DB server, but still give you the ability to quickly and easily search, replace and delete individual records. If you still need the flat file for some other reason, you can easily write some import/export routines.
Another interesting possibility, if you want to be creative, would be to look at your filesystem as a database. Give each user a directory. In each directory, have a file for locations. In each file, update the resources. This means that, to insert a row, you just write to a new file. To update a file, you just rewrite a single file. Deleting a user is just nuking a directory. Sure, there's a bit more overhead in slurping the whole thing into memory.
Other ways of solving the problem might be to make your flat-file write-only, since appending to the end of a file is a trivial operation. You then create a second file that lists "dead" line numbers that should be ignored when reading the flat file. Similarly, you could easily "X" out the existing lines (which, again, is far easier than trying to update lines in a file that might not be the same length) and append your new data to the end.
Those second two ideas aren't really meant to be practical solutions as much as they are to show you that there's always more than one way to solve a problem.
OK... after a few hours' work, this example worked fine for me.
I intended to code an editing tool and use it for password updates, and it did the trick!
Not only does this page send an email with the new password to the user (sorry, the address is hardcoded to avoid posting additional code), but it also edits the entry for that user and rewrites all the file info into a new file.
When done, it swaps the filenames, storing the old file as usuarios_old.txt.
Grab the code here (sorry, Stack Overflow got very picky about code posting):
https://www.iot-argentina.xyz/edit_flat_databse.txt
Is this what you are looking for:
UPDATE `table` SET `field to replace` = '$username:$worldLocation:$updatedResources' WHERE `field` = '$worldLocation';

Store data using a txt file

This question might seem strange but I have been searching for an answer for a long time and I couldn't find any.
Let's suppose you have a blog, and this blog has many post entries just like any other blog. Now each post can have simple user comments. No like buttons or any other feature that would require data management. Now the question is: can I store the user comments for each post in a single text file? Each post would be associated with a text file that holds its comments. So, if I have n posts I'll have n text files.
I know I can perfectly well do this, but I have never seen it done anywhere else and no one is talking about it. To me this seems better than storing all comments from all posts in a single MySQL table, but I don't know what makes it so bad that no one has implemented it.
Storing comments in text files associated with the corresponding posts? Let's see if it's a good idea.
Okay, adding new comments is easy: write the new text to the file. But what about the format of your data? CSV? OK, then you would have to parse it before rendering.
Paging. If you have a lot of comments you may want paging navigation for them. It can be done easily, sure, but you would need to open the file and read all the records just to extract, say, 20.
Approving your comments. Someone posts a new comment and you store it with pending status. In the admin panel you then need to find those marked comments and process them accordingly: save or remove. Do you think that's convenient with text files? The same applies if a user decides to remove his own comment.
Reading files, if you have many comments and many posts, will be slower than it would be with a database.
Scalability. One day you decide to extend your comments functionality to let one comment respond to another. How would you do that with text files? Or take the example from nico's comment: "In 6 months' time, when you want to add a rating field to the comments... you'll have a big headache. Or, just run a simple ALTER query."
This is just for a start. Someone may add more.
Well, there are good reasons why this isn't done. I can't possibly name them all, but the first things that come to mind:
Efficiency
Flexibility
Databases are much more efficient and flexible than plain text files. You can index, search and assign keys to individual comments and edit and delete any comments based on their key.
Furthermore, you'd get a huge pile of text files if the blog is quite big. While that's not a problem in itself, if you save them all in one directory it can grow out of proportion and really increase the time needed to find and open a specific text file.
