I am looking to set up a server that accepts a URL with a few variables in it, which I'll read via $_GET. These variables will increment counters stored on the server. I was going to write these values to a file and then open/write to it, kind of like this: http://www.developingwebs.net/phpclass/hitcounter.php. However, that example only tracks one variable. I changed the code to handle multiple variables, but am not entirely sure how to write them all to the file. I was using this:
$counters = "counter.txt";
$increment = file($counters);
/* Bunch of if/else ladders checking the $_GET values and
   incrementing $increment[] accordingly */
for ($i = 0; $i < 16; $i++) // write variables to file
    fputs($fp, $increment[$i]);
Where $fp points to the text file I was using and $increment[ ] holds the variables being incremented and such. So would this work? And would this work with multiple people accessing this URL at the same time? It needs to keep an accurate count of all the variables regardless of how many people are accessing the page.
Example: a survey submitted online with 4 questions. Each question has 4 options, so 16 counters in total being stored. People will be submitting their responses to the server at random times, possibly simultaneously. I need to parse each response and update the counters accordingly, even when multiple people are submitting at the same time.
Thanks for any help, hope I supplied enough detail but if not just ask questions.
EDIT: The URL is being sent from an Android device to the server. I don't know if that changes anything, but just wanted to be clear: the Android device is submitting the survey responses.
Building on Graham's comment, you'd be far better off letting your database server handle responses, and building your totals as part of a reporting system rather than part of the form submission process.
Here's an example, in sort-of-meta code. First, your HTML form:
<form method="GET"> <!-- though I recommend POST instead -->
    <input type="checkbox" name="ch[1]" value="Yes"> Checkbox 1
    <input type="checkbox" name="ch[2]" value="Yes"> Checkbox 2
    <input type="checkbox" name="ch[3]" value="Yes"> Checkbox 3
    <input type="submit" value="Submit">
</form>
Then, the PHP that receives the form:
<?php
$qfmt = "INSERT INTO answers (question, answer) VALUES ('%s', '%s')";
foreach ($_GET['ch'] as $key => $value) {
    if ($value == 'Yes') {
        $query = sprintf($qfmt,
            mysql_real_escape_string($key),
            mysql_real_escape_string($value));
        mysql_query($query);
    }
}
print "<p>Thanks!</p>\n";
?>
Lastly, to gather your totals:
SELECT COUNT(*) FROM answers WHERE question = '1';
SELECT COUNT(*) FROM answers WHERE question = '2';
SELECT COUNT(*) FROM answers WHERE question = '3';
You can adapt this to handle other form input as well, and perhaps store a long-lived session cookie to let you detect whether the same browser gets used to fill out your form multiple times.
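As a sketch of the same flow with PDO prepared statements instead of sprintf-built SQL (the table and column names follow the example above; the helper names and the single GROUP BY totals query are my own additions, not part of the original answer):

```php
<?php
// Insert one row per checked answer, with values bound rather than
// interpolated into the SQL, so no manual escaping is needed.
function record_answers(PDO $pdo, array $ch): void
{
    $stmt = $pdo->prepare('INSERT INTO answers (question, answer) VALUES (?, ?)');
    foreach ($ch as $question => $answer) {
        if ($answer === 'Yes') {
            $stmt->execute([$question, $answer]);
        }
    }
}

// One GROUP BY query replaces the per-question COUNT(*) queries.
function totals(PDO $pdo): array
{
    $rows = $pdo->query('SELECT question, COUNT(*) FROM answers GROUP BY question');
    return $rows->fetchAll(PDO::FETCH_KEY_PAIR);  // [question => count]
}
```

You would call `record_answers($pdo, $_GET['ch'] ?? [])` from the form handler and `totals($pdo)` from the reporting page.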
This could work, assuming you already have an existing file with the right number of newline-separated values to start you off. I do not believe, however, that it will maintain data integrity very well, as concurrent requests could cause race conditions. To prevent them, you would open the file, flock() it, then write to it; other script instances that also call flock() on the file will have to wait (the lock is advisory). A better approach would be a database.
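For completeness, a minimal sketch of that flock() pattern applied to the questioner's 16 counters; the file layout (one integer per line) and the bump_counter() helper are assumptions of mine, not from the post:

```php
<?php
// Bump one of $slots counters stored one-per-line in $path, under an
// exclusive lock so concurrent requests cannot interleave their writes.
// $index (which counter to bump) would come from your $_GET parsing.
function bump_counter(string $path, int $index, int $slots = 16): array
{
    $fp = fopen($path, 'c+');            // create if missing, don't truncate
    if ($fp === false || !flock($fp, LOCK_EX)) {
        return [];                       // could not lock; caller decides
    }
    $lines  = array_filter(explode("\n", stream_get_contents($fp)), 'strlen');
    $counts = array_pad(array_map('intval', $lines), $slots, 0);
    $counts[$index]++;
    ftruncate($fp, 0);                   // rewrite the whole file
    rewind($fp);
    fwrite($fp, implode("\n", $counts) . "\n");
    fflush($fp);
    flock($fp, LOCK_UN);                 // release before closing
    fclose($fp);
    return $counts;
}
```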
You could use the http_build_query function to store the $_GET array, using a separator string such as "||".
$data = http_build_query($_GET) . "||";
$fp = fopen("counter.txt", "a"); // append, so earlier entries are kept
fputs($fp, $data);
fclose($fp);
To read the stored info you could do this:
$fp = fopen("counter.txt", "r");
$contents = fread($fp, filesize("counter.txt"));
fclose($fp);
$array = array_filter(explode("||", $contents)); // drop the trailing empty chunk
foreach ($array as $var) {
    parse_str($var, $data);
    // $data contains your stored values
}
I already have a PHP script to upload a CSV file: it's a collection of tweets associated with a Twitter account (aka a brand). BTW, thanks T.A.G.S :)
I also have a script to parse this CSV file: I need to extract emojis, hashtags, links, retweets, mentions, and many more details I need to compute for each tweet (it's for my research project: digital affectiveness. I've already stored 280k tweets, with 170k emojis inside).
Then each tweet and its metrics are saved in a database (table TWEETS), as well as emojis (table EMOJIS), as well as account stats (table BRANDS).
I use a class quite similar to this one: CsvImporter > https://gist.github.com/Tazeg/b1db2c634651c574e0f8. I made a loop to parse each line 1 by 1.
$importer = new CsvImporter($uploadfile, true);
while ($content = $importer->get(1)) {
    $pack = $content[0];
    $data = array();
    foreach ($pack as $key => $value) {
        $data[] = $value;
    }
    $id_str = $data[0];
    $from_user = $data[1];
    ...
After all my computations, I "INSERT INTO TWEETS VALUES(...)", and the same with EMOJIS. After that, I have to perform some other operations:
update reach for each id_str (if a tweet I saved is a reply to a previous tweet)
save stats to table BRAND
All these operations are scripted in a single file, insert.php, and triggered when I submit my upload form.
But everything falls down if there are too many tweets; my server cannot handle such long operations.
So I wonder if I can ajaxify parts of the process, especially the loop:
upload the file
parse 1 CSV line and save it in SQL and display a 'OK' message each time a tweet is saved
compute all other things (reach and brand stats)
I'm not really familiar with $.ajax(), but I guess there is something to do with beforeSend, success, complete and the other Ajax events. Or maybe I'm completely wrong!?
Is there anybody who can help me?
As far as I can tell, you can lighten the load on your server substantially, because $pack is already an array of values and there is no need for the key-value loop.
You can also write the mapping of values from the CSV row more idiomatically. Unless you know the CSV file is likely to be huge, you should also fetch multiple lines at a time:
$importer = new CsvImporter($uploadfile, true);
// get as many lines as possible at once...
while ($content = $importer->get()) {
// this loop works whether you get 1 row or many...
foreach ($content as $pack) {
list($id_str, $from_user, ...) = $pack;
// rest of your line processing and SQL inserts here....
}
}
You could also go on from this and insert multiple lines into your database in a single INSERT statement, which is supported by most SQL databases.
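For example, a sketch with PDO (the table and column names follow the question; the batching helper itself is my own assumption, not code from the post):

```php
<?php
// Build one INSERT with a (?, ?) group per row, then bind all values
// flat: one round-trip to the database instead of one query per tweet.
function insert_batch(PDO $pdo, array $rows): void
{
    if ($rows === []) {
        return;
    }
    $groups = implode(', ', array_fill(0, count($rows), '(?, ?)'));
    $stmt = $pdo->prepare(
        "INSERT INTO tweets (id_str, from_user) VALUES $groups"
    );
    $flat = [];
    foreach ($rows as [$id_str, $from_user]) {
        $flat[] = $id_str;
        $flat[] = $from_user;
    }
    $stmt->execute($flat);
}
```

You would collect, say, 100 parsed rows from the CsvImporter loop and flush them with one insert_batch() call.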
$entries = [];
$f = fopen($filepath, "r");
while (($line = fgetcsv($f, 10000, ",")) !== false) {
    array_push($entries, $line);
}
fclose($f);
Try this, it may help.
For security reasons we need to disable PHP/MySQL for a non-profit site, as it has a lot of vulnerabilities. It's a small site, so we want to just rebuild it without a database and bypass the vulnerability of an admin page.
The website just needs to stay alive and remain dormant. We do not need to keep updating the site in future so we're looking for a static-ish design.
Our current URL structure is such that it has query strings in the url which fetches values from the database.
e.g. artist.php?id=2
I'm looking for an easy and quick way to change artist.php so that instead of fetching values from a database it would just include data from a flat HTML file, so:
artist.php?id=1 = fetch data from /artist/1.html
artist.php?id=2 = fetch data from /artist/2.html
artist.php?id=3 = fetch data from /artist/3.html
artist.php?id=4 = fetch data from /artist/4.html
artist.php?id=5 = fetch data from /artist/5.html
The reason for doing it this way is that we need to preserve the URL structure for SEO purposes, so I do not want to expose the HTML files to the public directly.
What basic php code would I need to achieve this?
To do it exactly as you ask would be like this:
$id = intval($_GET['id']);
$page = file_get_contents("/artist/$id.html");
If $id === 0, the query parameter contained something other than a number. You could also keep the artist information in an array:
<?php
// datafile.php
return array(
    1 => "Artist 1 is this and that",
    2 => "Artist 2..."
);
And then in your artist.php
$data = include('datafile.php');
if (array_key_exists($_GET['id'], $data)) {
$page = $data[$_GET['id']];
} else {
// 404
}
HTML isn't your best option, but its cousin is THE BEST for static data files.
Let me introduce you to XML! (documentation to PHP parser)
XML is similar to HTML as structure, but it's made to store data rather than webpages.
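As a sketch of that idea (the file layout, element names, and the find_artist() helper are all illustrative assumptions, not from the question), the artist data could live in one XML document and be looked up with SimpleXML:

```php
<?php
// A hypothetical artists.xml layout, inlined here so the example is
// self-contained; in practice you would simplexml_load_file() it.
$xmlSource = <<<XML
<artists>
  <artist id="1">Artist 1 is this and that</artist>
  <artist id="2">Artist 2...</artist>
</artists>
XML;

// Look up one artist's entry by id; returns null when the id is unknown.
function find_artist(string $xml, int $id): ?string
{
    $doc = simplexml_load_string($xml);
    $hit = $doc->xpath("//artist[@id='$id']");
    return $hit ? (string) $hit[0] : null;
}

echo find_artist($xmlSource, 2) ?? 'missing artist';
```

Because the id is cast to int before it reaches the XPath query, a malformed `?id=` value simply finds nothing.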
If instead your html pages are already completed and you just need to serve them, you can use the url rewriting from your webserver (if you're using Apache, see mod_rewrite)
At last, a pure PHP solution (which I don't recommend)
<?php
//protect from displaying unwanted webpages or other vulnerabilities:
//we NEVER trust user input, and we NEVER use it directly without checking it up.
$valid_ids = array(1, 2, 3, 4, 5 /* etc */);
if (in_array($_REQUEST['id'], $valid_ids)) {
    $id = $_REQUEST['id'];
} else {
    echo "missing artist!"; die;
}
//read the html file
$html_page = file_get_contents("/artist/$id.html");
//display the html file
echo $html_page;
I have searched and searched for a simple solution and I am afraid there may not be a simple one after all of my research...
Here is what I have... All users who are authorizing the Twitter app are opting in to being added to a list (the CSV file) to be followed by other users. It's a spot for users with the same interest to follow each other.
Here is what I am trying to achieve. I have users authorizing their Twitter account, which inserts their username into a CSV file using PHP.
Users can only access the page once they have authorized their Twitter account, which I verify via OAuth to be an actual account.
However, if the user visits the page again, they will be inserted into the csv file again.
I have very simple logic to prevent an insert if they refresh the page and whatnot (when the $username value is blank).
<?php
if ($username != '') { // skip blank usernames so an empty row isn't inserted
    $cvsData = $username . "\n";
    $fp = fopen("Twitter.csv", "a"); // $fp is now the file pointer to the file
    if ($fp) {
        fwrite($fp, $cvsData); // write information to the file
        fclose($fp);           // close the file
    }
}
?>
There is no header row in the file. It is just a single column of twitter handles.
I know the logic sounds kind of simple:
open csv file
loop through each row searching for value of $username
if $username exists - close file
else append $username
close file
Unfortunately I am still learning and having trouble finding a solution. Perhaps it's the loop part that I need help with? Or a starting point. I am not at all looking for the code to make this happen, though that would save me time. I'm not afraid to learn it, just having trouble finding the solution on my own. Any help is very much appreciated!
array_unique — Removes duplicate values from an array
Put the values from the CSV into an array and run array_unique() on it.
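A sketch of the check-before-append approach the question describes (the add_handle() helper and its flags are my own naming, not from the answer). Checking before writing means duplicates never enter the file, so there is nothing for array_unique() to clean up later:

```php
<?php
// Append $username to the CSV only if it is not already present.
function add_handle(string $file, string $username): bool
{
    $username = trim($username);
    if ($username === '') {
        return false;                       // never write a blank row
    }
    $existing = file_exists($file)
        ? file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
        : [];
    if (in_array($username, $existing, true)) {
        return false;                       // already on the list
    }
    // LOCK_EX guards against two users appending at the same moment
    file_put_contents($file, $username . "\n", FILE_APPEND | LOCK_EX);
    return true;
}
```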
I have written an HTML form to collect a user's inputs for an order, and also a PHP program to receive the order when the Submit button is pressed. In addition, I need to update a text file stored on the web server to reflect the order items. Can anyone explain how I am to go about updating a text file stored on the server? Thanks.
You should lock the file to protect the file from getting clobbered by concurrent updates sent by several users. There is a full example of locking and writing to a file in the flock function documentation: http://es.php.net/manual/en/function.flock.php
Or, to save yourself the trouble, use a proper database. SQLite is easy to use and requires no setting up: http://es.php.net/manual/en/book.sqlite3.php
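To illustrate the SQLite route (the table name, column names, and the bump_item() helper are hypothetical; the ON CONFLICT upsert needs SQLite 3.24+):

```php
<?php
// Record one more of $name in an order-items table, creating the row
// on first sight, and return the new total for that item.
function bump_item(SQLite3 $db, string $name): int
{
    $db->exec('CREATE TABLE IF NOT EXISTS items
               (name TEXT PRIMARY KEY, qty INTEGER NOT NULL DEFAULT 0)');
    // one statement both inserts a new item and increments an existing one
    $stmt = $db->prepare('INSERT INTO items (name, qty) VALUES (:n, 1)
                          ON CONFLICT(name) DO UPDATE SET qty = qty + 1');
    $stmt->bindValue(':n', $name, SQLITE3_TEXT);
    $stmt->execute();

    $read = $db->prepare('SELECT qty FROM items WHERE name = :n');
    $read->bindValue(':n', $name, SQLITE3_TEXT);
    return (int) $read->execute()->fetchArray(SQLITE3_NUM)[0];
}
```

SQLite handles the concurrent-update locking for you, which is exactly the problem the flat file has.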
Use fwrite:
$fp = fopen('data.txt', 'w');
fwrite($fp, $yourData);
fclose($fp);
UPDATE:
If I understood you right you need something like this:
if (!empty($noOfApples)) {
    // 'w+' would truncate the file before it could be read, so read
    // the current total first and then write the new one back
    $count = (int) file_get_contents('data.txt');
    $count += $noOfApples;
    file_put_contents('data.txt', $count);
}
Simplest way to do this AND preserve array structure of form fields is to dump $_POST via serialize.
Write example, after a user clicks submit:
file_put_contents('myfile.txt', serialize( $_POST ) );
Read example:
$data = unserialize(file_get_contents('myfile.txt'));
The form field would look something like:
<input type="text" name="myfield" value="<?php echo $data['myfield'] ?>" />
Alternatively, you can technically do $_POST = unserialize(file_get_contents(... but it will obviously overwrite anything that a user might input.
Just store the user input as key-value pairs, where the key is the input name and the value is what the user entered. That way you get all the user input in an array.
Then you can turn the key-value pairs into a string with the serialize function. Now you can store the string in a file with this code:
$fp = fopen("user{$id}.txt", 'w'); // double quotes so {$id} interpolates
fwrite($fp, $dataFromUser);
fclose($fp);
When you want to show the user input, just read the file and unserialize the string you get from the file and fill the input with the data which is stored in the array with the same key. When you want to update the user input just do the process above again.
After spending 3 days on the internet and struggling with so many different forums, I have found a match and a similar case to my problem here.
Friends, I am zero in PHP, but still I have managed to do something to fulfill my requirement.
I am stuck with one thing now, so I need help on the following.
I am using one HTML+PHP form to submit data into MySQL.
I created a display of that table through a PHP script on a webpage.
Now I want a datepicker option on that displayed page, by which I should be able to select a date range and display the data for that range from my MySQL table.
And then export the data displayed for the selected date range to Excel.
This displayed page is login protected, so after login the next thing shown should be a date selection option (a from date and a to date); then the records should be displayed from the database, and I can export those displayed results to an Excel file.
The code I am using on this page is below. It does not yet include anything for Excel export or a datepicker script; I am pasting it here and request you to please include the required code as needed.
Thanks In advance
<?php
// database connections
$db_host = 'localhost';
$db_user = '***********';
$db_pwd = '*************';
$database = 'qserves1_uksurvey';
$table = 'forms';
$file = 'export';

if (!mysql_connect($db_host, $db_user, $db_pwd))
    die("Can't connect to database");
if (!mysql_select_db($database))
    die("Can't select database");

// sending query
$result = mysql_query("SELECT * FROM {$table} ORDER BY date DESC");
if (!$result) {
    die("Query to show fields from table failed");
}

$num_rows = mysql_num_rows($result);
$fields_num = mysql_num_fields($result);
echo "$num_rows";
echo "<h1></h1>";
echo "<table border='1'><tr>";

// printing table headers
for ($i = 0; $i < $fields_num; $i++) {
    $field = mysql_fetch_field($result);
    echo "<td>{$field->name}</td>";
}
echo "</tr>\n";

// printing table rows
while ($row = mysql_fetch_row($result)) {
    echo "<tr>";
    // $row is an array... foreach puts every element of $row into $cell
    foreach ($row as $cell)
        echo "<td>$cell</td>";
    echo "</tr>\n";
}
echo "</table>";
mysql_free_result($result);
?>
</body></html>
This isn't a "write my code for me, please" site, so you're going to need to be a little more engaging and pro-active. But we can certainly provide some guidance. Let's see...
Currently you have a page which displays all records from a given table, is that correct? And you need to do two things:
Before displaying any records, have the user select a date range. And keep the date range selection on the page so the user can re-select.
Provide a button which lets the user export the selected records to Excel.
For either of these, you're going to need to add an actual form to the page. Currently there isn't one. For the date picker, I recommend (naturally) using the jQuery UI datepicker. So the form for that would look something like this:
<form method="POST" action="myPHPFile.php">
<input type="text" id="fromDate" name="fromDate" />
<input type="text" id="toDate" name="toDate" />
<input type="submit" name="filterDate" value="Submit" />
</form>
<script>
$(function() {
$("#fromDate").datepicker();
$("#toDate").datepicker();
});
</script>
The $(function() {...}) wrapper is jQuery's shorthand for $(document).ready(), so the datepicker setup will run once the DOM is ready; still, you'll want to test that it works on your page. Anyway, this will give you a form to submit the dates to your script. Wrap the parts of your script which output data in a conditional which determines whether the form values are present. If they're not, don't fetch any records. If they are, do some basic input checking (make sure the values are valid dates, make sure fromDate is before toDate, etc.) and construct your SQL query to filter by date range. (Do take care to avoid SQL injection vulnerabilities here.)
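A sketch of that validation step; the mm/dd/yyyy input format (jQuery UI datepicker's default) and the parse_range() helper are assumptions of mine, not from the question:

```php
<?php
// Validate the posted dates and normalize them to Y-m-d for SQL.
// Returns null when either date is missing/invalid or the range is
// reversed, so the caller can skip the query entirely.
function parse_range(array $post): ?array
{
    $from = DateTime::createFromFormat('m/d/Y', $post['fromDate'] ?? '');
    $to   = DateTime::createFromFormat('m/d/Y', $post['toDate'] ?? '');
    if (!$from || !$to || $from > $to) {
        return null;
    }
    return [$from->format('Y-m-d'), $to->format('Y-m-d')];
}
```

With a valid range in hand, a parameterized query such as `SELECT * FROM forms WHERE date BETWEEN ? AND ? ORDER BY date DESC` keeps the user input out of the SQL string, addressing the injection concern above.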
For the Excel output, you may be able to find a ready-made solution for you that just needs a little tinkering. If I were to create one from scratch, I'd probably just output to a .csv file rather than a full Excel file. Most users don't know/care the difference. In that case, you'd just want to either create a second script which is nearly identical to the existing one or add a flag to the existing one which switches between HTML and CSV output, such as via a hidden form field.
For the output of the CSV, first make sure you set your response headers. You'll want to write a header to tell the browser that you're outputting a CSV file rather than text/html, and possibly suggest a file name for the browser to save. Then, the form inputs and the SQL query will all be pretty much the same as before. The only difference is in the "HTML" that's being output. Rather than HTML tags, you'd wrap the records in commas, double-quotes (where appropriate), and carriage returns.
There's really nothing special to outputting a "file" vs. "HTML" because the HTTP protocol has no distinction between the two. It's always just text with headers.
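As a sketch of that CSV output (the column names and the rows_to_csv() helper are illustrative): fputcsv handles the commas and quoting described above, so you only supply the rows.

```php
<?php
// Turn query rows into CSV text; fputcsv adds the commas, quotes,
// and escaping so you don't have to.
function rows_to_csv(array $rows): string
{
    $out = fopen('php://temp', 'r+');
    fputcsv($out, ['date', 'name', 'answer']);   // header row
    foreach ($rows as $row) {
        fputcsv($out, $row);
    }
    rewind($out);
    $csv = stream_get_contents($out);
    fclose($out);
    return $csv;
}

// Before echoing the CSV, tell the browser it is a file download:
// header('Content-Type: text/csv; charset=utf-8');
// header('Content-Disposition: attachment; filename="export.csv"');
```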
Now, I'm sure you have more questions regarding this. And that's fine. In fact, we like to encourage asking (and, of course, answering) questions here. So please feel free to ask for clarification either in comments on this answer (or other answers), or by editing and refining your original question, or by asking an entirely new question if you have a specific topic on which you need help. Ideally, a good question on Stack Overflow consists of sample code which you are trying to write, an explanation of what the code is supposed to be doing, a description of the actual resulting output of the code, and any helpful information relevant to the code. As it stands right now, your question provides code somewhat unrelated to what you're asking, and you're just requesting that we add some features to it outright for you.