We have some users who have added the same tag and category to a blog. When they do this, they are unable to edit or use those tags/categories and receive an error basically stating “You need a higher level of permission.”
So far I've determined the actual error being thrown is “Term ID is shared between multiple taxonomies”, which is what we have been receiving when trying to edit or delete certain post categories or post tags.
Debugging this further, the issue seems to be happening at creation time. When I look at the tables in the database the terms table looks fine, but the term_taxonomy table does not. The same term_id is being saved for both entries.
MariaDB [wordpress]> select * from wp_62_terms;
+---------+-----------------------+-----------------------+------------+
| term_id | name                  | slug                  | term_group |
+---------+-----------------------+-----------------------+------------+
|       1 | Uncategorized         | uncategorized         |          0 |
|       2 | Blogroll              | blogroll              |          0 |
|  107691 | ppppp                 | ppppp                 |          0 |
|  107692 | ppppp                 | ppppp                 |          0 |
+---------+-----------------------+-----------------------+------------+
MariaDB [wordpress]> select * from wp_62_term_taxonomy;
+------------------+---------+---------------+-------------+--------+-------+
| term_taxonomy_id | term_id | taxonomy      | description | parent | count |
+------------------+---------+---------------+-------------+--------+-------+
|                1 |       1 | category      |             |      0 |    19 |
|                2 |       2 | link_category |             |      0 |     0 |
|               34 |  107691 | post_tag      |             |      0 |     0 |
|               35 |  107691 | category      |             |      0 |     0 |
+------------------+---------+---------------+-------------+--------+-------+
I have been debugging this further and captured the $wpdb->last_query value for each table insert, and those read as follows:
INSERT INTO `wp_62_term_taxonomy` (`term_id`, `taxonomy`, `description`, `parent`, `count`) VALUES (107691, 'post_tag', '', 0, 0)
INSERT INTO `wp_62_term_taxonomy` (`term_id`, `taxonomy`, `description`, `parent`, `count`) VALUES (107692, 'category', '', 0, 0)
The INSERT SQL shows the correct term_id; however, that is not what is getting stored in the database.
Manually updating the database value does correct the problem.
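For reference, the manual fix amounts to pointing the category row back at its own term. A sketch using the IDs from the tables above (adjust the table prefix and IDs for your own site):

UPDATE wp_62_term_taxonomy
SET term_id = 107692           -- the term_id the category INSERT actually used
WHERE term_taxonomy_id = 35;   -- the category row that ended up sharing 107691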
Any thoughts / ideas are appreciated!
This issue was caused by what seems to be a legacy and potentially unused feature of WordPress Multisite. When the global_terms_enabled value is enabled in the wp_sitemeta table, WordPress maintains a global term list in the wp_sitecategories table.
Within the wp-includes/ms-functions.php file, the global_terms function is called when a new taxonomy term is added. That function checks the wp_sitecategories table to see if the term already exists. However, the logic in that function is flawed and does not account for the same term value being used across multiple taxonomies. It detects the new term as a duplicate and then issues wpdb queries that change the term_id values as noted above.
This was logged as a bug in the WordPress Core here: https://core.trac.wordpress.org/ticket/55979
Changing the sitemeta value of global_terms_enabled from 1 to 0 resolved this behavior.
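If you want to flip the flag directly in the database instead, a sketch (assuming the default wp_sitemeta table name; adjust the prefix to match your install):

UPDATE wp_sitemeta
SET meta_value = '0'
WHERE meta_key = 'global_terms_enabled';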
My table has this data:
ID | company_id | code | description
01 | NULL       | CD01 | Standard description CD01
02 | NULL       | XYZU | Standard description XYZU
03 | 1          | CD01 | Custom description CD01 for company 1
04 | 2          | CD01 | Custom description CD01 for company 2
I need to extract all 'code' values from this table, but show each product code only once.
If a record exists with a company_id (a company-specific row), I show that one; if it doesn't exist, I show the record with the standard description.
Starting from the sample data, if I wanted to show the articles for company_id = 1, I would expect this output:
ID | company_id | code | description
02 | NULL       | XYZU | Standard description XYZU
03 | 1          | CD01 | Custom description CD01 for company 1
Do you have any suggestions on how to do it?
Thank you
To remove duplicate entries from database results with GROUP BY, you need to disable ONLY_FULL_GROUP_BY for MySQL. But don't do that in MySQL itself, and don't disable strict mode! Laravel itself sets some SQL modes.
That is the general approach to disabling this mode.
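For reference, the general MySQL-level statement looks something like this (a sketch; it won't survive a server restart unless you also change my.cnf, and Laravel may still set its own session modes when it connects):

SET GLOBAL sql_mode = REPLACE(@@GLOBAL.sql_mode, 'ONLY_FULL_GROUP_BY', '');
-- or only for the current connection:
SET SESSION sql_mode = REPLACE(@@SESSION.sql_mode, 'ONLY_FULL_GROUP_BY', '');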
But in Laravel you should also try another thing:
Go to YourProjectFolder/vendor/laravel/framework/src/Illuminate/Database/Connectors/MySqlConnector.php.
At the end of the file, find the strict_mode() function and just remove ONLY_FULL_GROUP_BY from the string within the function.
(I just saw this solution in a Stack Overflow post; unfortunately I couldn't find that post again.)
I am doing a project in PHP and MySQL. I need to create a database in which there is a category that can have many subcategories (for a course book, for example, I can have physics, computer science, mathematics, etc.).
Also there can be some categories which do not have subcategories (say Mythological Books).
The administrator has to add a new category or subcategory via a form on a PHP page, which then updates the database.
How should I design the structure of my table(s) so that the following two cases are handled:
The admin only wants to add a subcategory: upon selecting its parent category, it should be added under it.
The admin wants to add a new parent category for which no subcategory exists yet.
Please suggest how should I create my table(s).
Create a table called 'categories' with columns 'id', 'name', 'parent'.
Then use it like this:
1, Hardware, null
2, Storage, 1
3, Video, 1
4, Harddisks, 2
...
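A minimal sketch of the DDL this implies (the column sizes and the self-referencing foreign key are my assumptions, not requirements):

CREATE TABLE categories (
    id     INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    name   VARCHAR(100) NOT NULL,
    parent INT UNSIGNED NULL,                        -- NULL = top-level category
    FOREIGN KEY (parent) REFERENCES categories (id)  -- each child points at its parent row
);

Adding a new parent category is then an INSERT with parent = NULL; adding a subcategory is an INSERT with parent set to the id the admin picked in the form.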
Here is a simple example with books:
Categories table:
+----+--------------------+--------+
| id | name               | parent |
+----+--------------------+--------+
|  1 | Comics             |   NULL |
|  2 | Programming        |   NULL |
|  3 | SQL/PHP            |      2 |
|  4 | Java               |      2 |
|  5 | Marvel             |      1 |
|  6 | Mythological Books |   NULL |
+----+--------------------+--------+
Books table:
+----+---------------------------------------+----------+
| id | name                                  | category |
+----+---------------------------------------+----------+
|  1 | PHP/MYSQL for dummies                 |        3 |
|  2 | Iron Man and the prisoners of azkaban |        5 |
+----+---------------------------------------+----------+
If you want to show all the books in the Programming > SQL/PHP category, you simply get them with a SELECT query:
mysql> select * from books where category = 3;
+----+-----------------------+----------+
| id | name                  | category |
+----+-----------------------+----------+
|  1 | PHP/MYSQL for dummies |        3 |
+----+-----------------------+----------+
1 row in set (0,00 sec)
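If you also want the books sitting in Programming's subcategories (one level down), a sketch that joins on the parent column (table and column names as in the example above):

SELECT b.*
FROM books b
JOIN categories c ON c.id = b.category
WHERE c.id = 2      -- books directly in Programming
   OR c.parent = 2; -- books in its direct subcategories (SQL/PHP, Java)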
I have a problem that I can't figure out; I'm not experienced enough (or it can't be done!). I've trawled Google for the answer with no luck.
I have a system where I need to assign an ID to each row, with the ID from another table. The catch is that the ID must be unique for each row created in this batch.
Basically, I'm selling links on my Tumblr accounts. I need to assign a Tumblr account to each link that a customer purchases, but I want to spread the links across all available Tumblr accounts so that duplicates are kept to the minimum possible.
The URLs - each link that a customer buys is stored in this table (urls_anchors):
+----------+--------------------+------------+----------+------+
| clientID | URL                | Anchor     | tumblrID | paid |
+----------+--------------------+------------+----------+------+
|     1234 | http://example.com | Click here |       67 | Yes  |
|     1234 | http://example.com | Click here |       66 | Yes  |
|     1234 | http://example.com | Click here |       65 | Yes  |
|     1234 | http://example.com | Click here |       64 | Yes  |
+----------+--------------------+------------+----------+------+
All of the Tumblr accounts available for allocation are stored in this table (tumblrs):
+----------+-------------------+------------+
| tumblrID | tumblrURL         | spacesLeft |
+----------+-------------------+------------+
|       64 | http://tumblr.com |          9 |
|       65 | http://tumblr.com |          9 |
|       66 | http://tumblr.com |          9 |
|       67 | http://tumblr.com |          9 |
+----------+-------------------+------------+
My best attempt at this has been the following query:
INSERT INTO `urls_anchors` (`clientID`, `URL`,`Anchor`, `tumblrID`, `paid`) VALUES ('$clientID','$url','$line', (SELECT @rank:=@rank+1 AS tumblrID FROM tumblrs WHERE @rank < 68 LIMIT 1), 'No')
Which works, but it keeps incrementing indefinitely, when there are only X Tumblrs to assign. I need the query to loop back around when it reaches the last row of tumblrs and run through the list again.
Also, I'm using this in a PHP script; I'm not sure if that's of any significance.
Any help would be MASSIVELY appreciated!
Thanks for looking :)
You can use a SELECT query as the source of data to insert.
INSERT INTO urls_anchors (`clientID`, `URL`,`Anchor`, `tumblrID`, `paid`)
SELECT '$clientID','$url','$line', tumblrID, 'No'
FROM tumblrs
LIMIT $number_of_rows
This will assign $number_of_rows different tumblrID values to the rows.
If you need to assign more tumbler IDs than are available, you'll need to do this in a loop, subtracting the number of rows inserted from $number_of_rows each time. You can use mysqli_affected_rows() to find out how many rows were inserted each time.
I have a web app in which I show a series of posts based on this table schema (there are thousands of rows like this, and other columns too, removed as they are not required for this question):
+---------+----------+----------+
| ID      | COL1     | COL2     |
+---------+----------+----------+
| 1       | NULL     | ----     |
| 2       | ---      | NULL     |
| 3       | NULL     | ----     |
| 4       | ---      | NULL     |
| 5       | NULL     | NULL     |
| 6       | ---      | NULL     |
| 7       | NULL     | ----     |
| 8       | ---      | NULL     |
+---------+----------+----------+
And I use this query :-
SELECT * from `TABLE` WHERE `COL1` IS NOT NULL AND `COL2` IS NULL ORDER BY `COL1`;
And the result set I get is like:
+---------+----------+----------+
| ID      | COL1     | COL2     |
+---------+----------+----------+
| 12      | ---      | NULL     |
| 1       | ---      | NULL     |
| 6       | ---      | NULL     |
| 8       | ---      | NULL     |
| 11      | ---      | NULL     |
| 13      | ---      | NULL     |
| 5       | ---      | NULL     |
| 9       | ---      | NULL     |
| 17      | ---      | NULL     |
| 21      | ---      | NULL     |
| 23      | ---      | NULL     |
| 4       | ---      | NULL     |
| 32      | ---      | NULL     |
| 58      | ---      | NULL     |
| 61      | ---      | NULL     |
| 43      | ---      | NULL     |
+---------+----------+----------+
Notice that the ID column is jumbled thanks to the ORDER BY clause.
I have proper indexes to optimize these queries.
Now, let me explain the real problem. I have lazy-load functionality in my web app, so I display around 10 posts per page by adding LIMIT 10 to the query for the first page.
We are good till here. But the real problem comes when I have to load the second page. What do I query now? I do not want the posts to be repeated, and new posts come in almost every 15 seconds, which puts them at the top (by top I literally mean the first row) of the result set. I do not want to display these latest posts on the second or third pages, but they change the size of the result set, so I cannot use LIMIT 10,10 for the 2nd page and so on, because posts would be repeated.
Now, all I know is the last ID of the post that I displayed, say 21 here. So I want to display the posts with IDs 23, 4, 32, 58, 61, 43 (refer to the result set above). Do I load all the rows without using the LIMIT clause and display the 10 IDs occurring after ID 21? For that I would have to iterate over thousands of useless rows. But I cannot use a LIMIT clause for the 2nd, 3rd... pages, that is for sure. Also, the IDs are jumbled, so I can definitely not use WHERE ID > .... So, where do we go now?
I'm not sure if I've understood your question correctly, but here's how I think I would do it:
Add a timestamp column to your table, let's call it date_added
When displaying the first page, use your query as-is (with LIMIT 10) and hang on to the timestamp of the most recent record; let's call it last_date_added.
For the 2nd, 3rd and subsequent pages, modify your query to filter out all records with date_added > last_date_added, and use LIMIT 10, 10, LIMIT 20, 10, LIMIT 30, 10 and so on.
This will have the effect of freezing your resultset in time, and resetting it every time the first page is accessed.
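A sketch of what the later-page queries could look like under this scheme (date_added is the assumed new column and $last_date_added the value remembered from page 1):

SELECT *
FROM `TABLE`
WHERE `COL1` IS NOT NULL
  AND `COL2` IS NULL
  AND `date_added` <= '$last_date_added'  -- ignore anything added after page 1 was served
ORDER BY `COL1`
LIMIT 10, 10;                             -- page 2; LIMIT 20, 10 for page 3, and so on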
Notes:
Depending on the ordering of your resultset, you might need a separate query to obtain the last_date_added. Alternatively, you could just cut off at the current time, i.e. the time when the first page was accessed.
If your IDs are sequential, you could use the same trick with the ID.
Hmm..
I thought for a while and came up with 2 solutions:
Store the IDs of the posts already displayed and query WHERE ID NOT IN (id1, id2, ...). But that would cost you extra memory, and if the user loads 100 pages and the IDs run into the 100,000s, a single GET request would not be able to handle it. At least not in all browsers. A POST request can be used.
Alter the way you display posts from COL1. I don't know if this would be a good way for you, but it can save you bandwidth and make your code cleaner. It may also be a better way. I would suggest this: SELECT * from TABLE where COL1 IS NOT NULL AND COL2 IS NULL AND Id>.. ORDER BY ID DESC LIMIT 10,10. This can affect the way you display your posts by leaps and bounds. But, as you said in your comments that you check whether a post meets a criterion and change COL1 from NULL to the current timestamp, I guess that the newer the post, the higher up you want to display it. It's just an idea.
I assume new posts will be added with a higher ID than the current max ID, right? So couldn't you just run your query and grab the current max ID? Then when you query for page 2, do the same query but with ID < max_id. This should give you the same result set as your page 1 query, because any new rows will have ID > max_id. Hope that helps?
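A sketch of that idea, with $max_id standing in for the value grabbed while building page 1 (I use <= so the newest page-1 row itself stays in the set):

-- while serving page 1, remember the newest ID visible at that moment
SELECT MAX(`ID`) AS max_id
FROM `TABLE`
WHERE `COL1` IS NOT NULL AND `COL2` IS NULL;

-- pages 2, 3, ...: the same query, frozen at that ID
SELECT *
FROM `TABLE`
WHERE `COL1` IS NOT NULL
  AND `COL2` IS NULL
  AND `ID` <= $max_id  -- rows added later have a higher ID and are excluded
ORDER BY `COL1`
LIMIT 10, 10;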
How about:
ORDER BY `COL1`,`ID`;
This would always return the rows in a stable order. This will let you use:
LIMIT 10,10
for your second page.
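Put together, a sketch of the page-2 query:

SELECT *
FROM `TABLE`
WHERE `COL1` IS NOT NULL AND `COL2` IS NULL
ORDER BY `COL1`, `ID`
LIMIT 10, 10;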
I'm attempting to add Breadcrumbs to my website using a MySQL table, and I'm having difficulty at the moment.
I have a table named 'includes' that stores information about the category, page, subpage, the title, and the ref (URL) of the page. Category, Page, and Subpage are all PHP parameters passed from the page the user is on.
My table is laid out like this:
|------------------------------------------------------------|
| ID | Category | Page    | Subpage | Title            | Ref |
|------------------------------------------------------------|
| 0  |          |         |         | Home             | ... |
| 1  | Software |         |         | Software         | ... |
| 2  | Software | Desktop |         | Desktop Software | ... |
| 3  | Software | Mobile  |         | Mobile Software  | ... |
| 4  | Software | Desktop | Blah    | Blah Blah        | ... |
| ...                                                        |
|------------------------------------------------------------|
What I'm trying to do is make a query that will return only the required steps back to home for the breadcrumbs.
In other words, if the user is on "example.com/software/desktop/blah", the query will return rows 0,1,2, and 4. Or if I was on /software/mobile, it would only return rows 0,1, and 3.
My current attempts have been things like the following:
SELECT * FROM `includes` WHERE
`category` IS NULL AND `page` IS NULL AND `subpage` IS NULL OR
`category`='$category' AND `page` IS NULL AND `subpage` IS NULL OR
`category`='$category' AND `page`='$page' AND `subpage` IS NULL OR
`category`='$category' AND `page`='$page' AND `subpage`='$subpage'
Which not only doesn't work, but also seems more complex than it should have to be.
I'm probably overcomplicating this, or possibly just doing an entirely wrong method, which is why I've turned here.
Does anyone have a possible solution to this? Should I be looking at a more complex query? (admittedly, SQL is not my forte) Or should I be looking at a new SQL table, or possibly an entirely different method?
What you have is a hierarchical structure. The data is set up with parent-child relationships. There is a good description on how to work with hierarchical data here: http://explainextended.com/2009/03/17/hierarchical-queries-in-mysql/
You can make a self-referencing table like this:
id | parent_id | title    | Ref
1  | 0         | Home     | ...
2  | 1         | Software | ...
3  | 2         | Desktop  | ...
4  | 2         | Mobile   | ...
5  | 3         | Blah     | ...
So your query would first get the last element:
SELECT * FROM includes WHERE
title = 'Blah'
And then get the parent's title by its ID, and so on. Done like this, the table structure will be better, from my point of view and experience.
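If the depth is fixed at three levels (category > page > subpage), the parent lookups can also be collapsed into one query with self-joins. A sketch, assuming the self-referencing layout above and keeping the table name includes:

SELECT leaf.title   AS level_3,
       parent.title AS level_2,
       grand.title  AS level_1
FROM includes AS leaf
LEFT JOIN includes AS parent ON parent.id = leaf.parent_id
LEFT JOIN includes AS grand  ON grand.id  = parent.parent_id
WHERE leaf.title = 'Blah';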
OR
Or generate your query based on the values you get: with a simple loop, count the arguments, build the query string based on that, and then execute it.
I hope this can help :)