Sphinx returns wrong document ids - php

My Sphinx search returns wrong results: when I search for a keyword, the documents whose ids are returned do not contain that keyword.
Here is how I created the conf settings:
source source_name
{
    type          = mysql

    sql_host      = ******
    sql_user      = ******
    sql_pass      = ******
    sql_db        = ******
    sql_port      = # optional, default is 3306

    sql_query_pre = SET CHARACTER_SET_RESULTS=utf8
    sql_query_pre = SET NAMES utf8

    sql_query     = \
        SELECT \
            P.ID AS ID, P.TITLE AS TITLE, P.TITLE AS TITLE_SORT \
        FROM \
            PRODUCT P \
        WHERE \
            P.ISVALID='Y'

    sql_attr_string = TITLE_SORT
    sql_query_info  = SELECT * FROM PRODUCT WHERE ID=$id
}

index index_name
{
    source       = source_name
    path         = /path/to/data/file_name
    docinfo      = extern
    min_word_len = 1
    charset_type = utf-8
}

indexer
{
    mem_limit = 128M
}

searchd
{
    listen                 = 3312 # port is deprecated from 2.1+
    log                    = /path/to/log/searchd.log
    query_log              = /path/to/log/query.log
    read_timeout           = 5
    max_children           = 30
    pid_file               = /path/to/log/searchd.pid
    max_matches            = 1000
    seamless_rotate        = 0
    preopen_indexes        = 0
    unlink_old             = 1
    compat_sphinxql_magics = 0
}
One important thing: if I search with the test.php tool, I can see that the attribute values show products containing the searched keywords, but the document ids are still wrong, which makes me wonder why Sphinx returns the wrong document ids.
Another important thing: on the same machine, with the same conf file, I created an index for another MySQL database and it works fine.
Thanks
EDIT:
Here is an example:
I search for "professional" and I get this result
1. doc_id=33285, weight=102, title_sort=Wella Professional Bezoplachový kondicionér pro objem vlasů SP Volumize 150 ml, manufacturer_id=217, category_id=4648, min_price=0, product_rating=4294967295, filter_userid=(2714222508,3149373076)
2. doc_id=33286, weight=102, title_sort=Wella Professional Šampon pro lesk vlasů SP Shine Define 250 ml, manufacturer_id=217, category_id=3046113, min_price=0, product_rating=4294967295, filter_userid=(2714222508,3149373076)
3. doc_id=33287, weight=102, title_sort=Wella Professional Šampon pro barvené vlasy SP Color Save 250 ml, manufacturer_id=217, category_id=3046113, min_price=0, product_rating=4294967295, filter_userid=(2714222508,3149373076)
.. and so on ..
You can see that the title_sort field contains the word "professional", but the doc_ids returned (33285, 33286, 33287) do not belong to these records.
Below is the id - title data from the database:
33285 Avon Čisticí tonikum na tělo proti akné ve spreji Blemish Clearing 100 ml
33286 Biotherm Pleťový krém a sérum 2v1 pro navrácení pružnosti normální až smíšené pleti Age Fitness Elastic 30 ml AKCE
33287 Avon Dětský šampon Barbie® 200 ml
While the titles you see in title_sort above actually belong to these ids:
32854 Wella Professional Bezoplachový kondicionér pro objem vlasů SP Volumize 150 ml
32855 Wella Professional Šampon pro lesk vlasů SP Shine Define 250 ml
32856 Wella Professional Šampon pro barvené vlasy SP Color Save 250 ml

Sorry guys, the issue is solved! It was a silly mistake by our hosting support.
They had created a new database and transferred the old db's data into it; the old database was corrupted.
I found this out when I created a duplicate of the original table and then tried to index it: Sphinx gave me an error that the table does not exist. That made it click, and I compared the database settings on the site against sphinx.conf.
I did not find an issue like mine on any forum or on Google, apart from one with a similar configuration problem. So if you are reading this answer looking for a solution to a similar problem, check your configuration and confirm that your conf file points to the correct database and table. This will save you days, maybe weeks, of pain :).
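To double-check which database an index was really built from, it helps to look up a few of the returned doc ids directly in the database that sphinx.conf names and compare the titles against the test.php output. A minimal sketch of that check (the PRODUCT/ID/TITLE names come from the sql_query above; the helper function itself is illustrative, not Sphinx tooling):

```python
def build_title_query(ids):
    """Build a parameterized SELECT for the doc ids that searchd returned.

    Table/column names mirror the sql_query in sphinx.conf above.
    """
    placeholders = ", ".join(["%s"] * len(ids))
    return "SELECT ID, TITLE FROM PRODUCT WHERE ID IN (%s)" % placeholders

query = build_title_query([33285, 33286, 33287])
print(query)
# SELECT ID, TITLE FROM PRODUCT WHERE ID IN (%s, %s, %s)
```

Running this with the exact credentials from sphinx.conf (e.g. `cursor.execute(query, ids)` with a MySQL driver) immediately shows whether the conf file and the site point at the same data.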


How to execute a block of python code only when a row has been selected from mysql database

I've built a set of musical stairs with Python, motion sensors and a Raspberry Pi, plus a web app that lets you choose which type of instrument sound you want to make. The type of instrument is stored in a MySQL database, which is connected both to the Python code (which makes the sounds when a beam is broken) and to the web app that lets users select the instrument type.
I am just wondering: is there a way of querying the database from the Python code so that a particular block of code runs only when a given row is selected from the database?
Eg, someone clicks "Drum" on the web app.
instrumentType "Drum" is selected from MySQL database
Drumsound.play() should run on the python code.
Is there any way I could do this on python?
This is for a Raspberry Pi 3 running Python 2.7, mySQLdb5 and Apache2.
import mysql.connector

mydb = mysql.connector.connect(
    host="localhost",
    user="*****",
    password="*****",   # mysql.connector expects password=, not pw=
    database="stairs"   # and database=, not db=
)
cursor = mydb.cursor()
cursor.execute("SELECT variableValue FROM stairs WHERE variableValue = 'instrumentType'")
import RPi.GPIO as GPIO  # GPIO
import pygame.mixer      # To make sound

pygame.mixer.init()

''' GPIO setup '''
GPIO.setmode(GPIO.BCM)  # GPIO setmode
GPIO.setwarnings(False)

'''Define steps and pins here'''
step1 = 4

'''Motion sensor setup here'''
GPIO.setup(step1, GPIO.IN, GPIO.PUD_UP)

'''Piano files here'''
C1 = pygame.mixer.Sound("piano/C1.wav")

'''Drum files here'''
drum1 = pygame.mixer.Sound("drum/C1.wav")

def play(pin):
    sound = sound_pins[pin]
    print("Playing note from pin %s" % pin)
    sound.play()

'''Dictionary of steps and sounds'''
sound_pins = {
    step1: C1,
    step2: D,
    step3: E,
    step4: F,
    step5: G,
    step6: A,
    step7: B,
    step8: C2,
}

for pin in sound_pins:
    GPIO.setup(pin, GPIO.IN, GPIO.PUD_UP)
    GPIO.add_event_detect(pin, GPIO.RISING, play, 100)
You might consider using a Python dictionary to map your 'instrument' type to a function. This lets you create a mapping from each instrument to the sound it plays.
You might be asking yourself: "isn't this what I want my database to do?" Since you use the same pygame.mixer.Sound("SomeWavePath") call everywhere, you could store the relationship between instrument names and their sound files in the database itself. That way you can expand your instrument selection just by adding rows to your database.
May I also recommend switching over to Python 3.x, as 2.x is shortly reaching end of support (https://pythonclock.org/). This will also give you access to new language features and a wider range of library support going forward.
EDIT:
E.g Storing "instrument<->wav_path" mapping in your database.
# Query obtains the 'wav path' for the chosen 'instrument' primary key
# (table/column names here are illustrative).
cur.execute("SELECT wav_path FROM instruments WHERE name = %s", (instrument,))
row = cur.fetchone()                        # e.g. ("piano/C1.wav",) - a 1-tuple
sound_to_play = pygame.mixer.Sound(row[0])
E.g Attaching objects against your instruments
C1 = pygame.mixer.Sound("piano/C1.wav")
drum1 = pygame.mixer.Sound("drum/C1.wav")
instrument_dict = {
    "piano": C1,
    "drum": drum1
}
# Retrieve the sound to play; fetchone() returns a tuple, e.g. ("drum",)
row = cur.fetchone()
sound_to_play = instrument_dict[row[0]]
# instrument_dict['drum'] => drum1 Sound object.
Then:
sound_to_play.play()
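The selection step itself can stay tiny: keep one dictionary from instrument name to sound bank and look up whatever value the database row holds. A minimal sketch of that dispatch (the function and names are illustrative, not any library API; in the real script the values would be pygame.mixer.Sound objects):

```python
def choose_sound_bank(instrument, banks, default="piano"):
    """Return the sound bank for the instrument name stored in the database."""
    return banks.get(instrument, banks[default])

# Stand-ins for pygame.mixer.Sound objects, keyed by instrument name.
sound_banks = {
    "piano": "piano/C1.wav",
    "drum": "drum/C1.wav",
}

print(choose_sound_bank("drum", sound_banks))   # drum/C1.wav
print(choose_sound_bank("harp", sound_banks))   # unknown name falls back to piano/C1.wav
```

In the stairs script, the polling loop would call cur.fetchone() periodically and pass the stored name through this lookup before the GPIO callback picks which sound to play.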

Prestashop: use the Product class in a personal script

I need to create a cron script to process products (add, delete, update) from a raw data file sent by a third-party firm, which delivers a new file (data separated with ;) whenever a product is added, deleted or updated.
I tried using the Product class (from Product.php) to add new products to the shop. The first problem was this error:
class 'ObjectModel' not found in importProducts.php
I found an apparent solution, which was to require_once the config and init files from the config folder in importProduct.php:
require_once(dirname(__FILE__).'/../config/config.inc.php');
require_once(dirname(__FILE__) . '/../init.php');
The requires point to the right files. My file is located in a new folder called "crontasks":
prestashop/crontasks/import.php
(import.php starts importProduct.php)
But now there is a new error :
Fatal error: Uncaught You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '' at line 7
SELECT c.id_category
FROM ps_category_product cp
LEFT JOIN ps_category c ON (c.id_category = cp.id_category)
INNER JOIN ps_category_shop category_shop ON (category_shop.id_category = c.id_category AND category_shop.id_shop = 1)
WHERE cp.id_category NOT IN (18)
AND cp.id_product = thrown in D:\Travail\BUSCI\prestashop\prestashop\classes\db\Db.php on line 791
(the tags don't render here, but the query is wrapped in <br /><br /><pre>...</pre>)
The problem is, I checked the database and the columns are all there, and I don't see why there would be a problem on that line.
Also, I could not find which file that query comes from. I looked in Product.php and found one very similar, but not exactly the same:
$result = Db::getInstance()->executeS('
SELECT c.`id_category`
FROM `'._DB_PREFIX_.'category_product` cp
LEFT JOIN `'._DB_PREFIX_.'category` c ON (c.`id_category` = cp.`id_category`)
'.Shop::addSqlAssociation('category', 'c', true, null, true).'
WHERE cp.`id_category` NOT IN ('.implode(',', array_map('intval', $categories)).')
AND cp.id_product = '.$this->id
);
line 1051 in Product.php, method updateCategories.
I could not find anything similar in Category.php or any other file I looked in. But this query is missing the inner join, so I don't know if it's really the one. I suppose the problem comes from the fact that there is nothing after the =, but I don't know whether that is just a character limit on the printed error or the error itself. I'm feeling a little lost.
I did not change any files from prestashop. This is the latest update I just installed a week ago on my computer. The tests are done in local, on my computer.
I would really appreciate some help.
Excuse me for my English, it's not amazing.
Edit :
So I searched the whole project, every file, for the line
INNER JOIN '._DB_PREFIX_.'category_shop
and could not find it. I tried removing the extra spaces, just in case, but still could not find the file this SQL query is in.

Warning: ldap_add(): Add: Object class violation

Over the past few days I've been trying to learn more about LDAP, and I would now like to be able to create a new account on my phpLDAPadmin server from a web form. I have the values being passed back through PHP correctly, but I keep getting an objectClass violation error. I've scoured many different resources (including this one), and basically all I can find is that the objectClass list needs to match exactly how the schema is set up. I ran an export for some of the manually created users that already work, and this is an example of the output:
# LDIF Export for cn=api user,cn=students,ou=users,dc=myhost,dc=com
# Server: LDAP (ip)
# Search Scope: sub
# Search Filter: (objectClass=*)
# Total Entries: 1
#
# Generated by phpLDAPadmin (http://phpldapadmin.sourceforge.net) on June 4, 2016 3:15 pm
# Version: 1.2.2
version: 1
# Entry 1: cn=api user,cn=students,ou=users,dc=myhost,dc=co...
dn: cn=test user,cn=students,ou=users,dc=myhost,dc=com
cn: test
gidnumber: 502
givenname: test
homedirectory: /home/users/testuser
loginshell: /bin/sh
objectclass: inetOrgPerson
objectclass: posixAccount
objectclass: top
sn: tuser
uid: testuser
uidnumber: 1003
userpassword: {MD5}pass==
and I have tried mimicking it as closely as possible in my script (below), but I am still getting the violation error. There are no problems connecting or with any of the other fields; only the objectclass problem.
$ds = ldap_connect($AD_server);
if ($ds) {
    ldap_set_option($ds, LDAP_OPT_PROTOCOL_VERSION, 3);
    $r = ldap_bind($ds, $AD_Auth_User, $AD_Auth_PWD);

    $info["cn"] = $user_full_name;
    $info["sn"] = $user_username;
    $info['objectclass'][0] = "top";
    $info['objectclass'][1] = "posixAccount";
    $info['objectclass'][2] = "inetOrgPerson";
    $info['uid'] = $user_username;
    $info['userpassword'] = $newPassw;
    $info['loginshell'] = '/bin/sh';
    $info['homedirectory'] = "/home/users/$user_username";

    // add data to directory
    $r = ldap_add($ds, $dn, $info);
    ldap_close($ds);
} else {
    echo "Unable to connect to LDAP server";
}
I've played around with the objectclasses and tried switching their positions or using only inetOrgPerson, and still no luck. Any thoughts?
When creating entries within LDAP you need to know which Attributes are "MUST" (required) for the ObjectClasses used when creating the entry.
In your example:
person MUST ( sn $ cn )
posixAccount MUST ( cn $ uid $ uidNumber $ gidNumber $ homeDirectory )
So to create the entry in LDAP you MUST have values for all of these:
sn
cn
uid
uidNumber
gidNumber
homeDirectory
You can tell which attributes are required by querying the LDAP schema and reading each objectClass to determine which attributes are "MUST" (required).
It looks like you need to make sure to pass every value back through. I was missing the uidNumber, givenName and gidNumber fields. But now it works! :)
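One way to catch this class of error before calling ldap_add is to compare the entry against the MUST list above. A small sketch of that check (the helper is illustrative, not part of any LDAP library; the required set is the union of the person/posixAccount MUSTs quoted earlier):

```python
# MUST attributes from: person ( sn $ cn ) and
# posixAccount ( cn $ uid $ uidNumber $ gidNumber $ homeDirectory )
REQUIRED = {"sn", "cn", "uid", "uidNumber", "gidNumber", "homeDirectory"}

def missing_must_attributes(entry, required=REQUIRED):
    """Return the MUST attributes absent from the entry (case-insensitive)."""
    present = {k.lower() for k in entry}
    return sorted(a for a in required if a.lower() not in present)

entry = {
    "cn": "test",
    "sn": "tuser",
    "uid": "testuser",
    "objectclass": ["top", "posixAccount", "inetOrgPerson"],
}
print(missing_must_attributes(entry))
# ['gidNumber', 'homeDirectory', 'uidNumber']
```

Running this against the $info array from the question would have flagged the missing uidNumber/gidNumber/homeDirectory values before the server rejected the add.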

Error using node_load_multiple() API for Drupal 7

I am building a custom module for Drupal 7 which deletes all nodes of a given content type. I need to load all nodes of that content type. For this I have the following code:
$type = "apunte";
$nodes = node_load_multiple(array(), array('type' => $type));
My problem is that I have a lot of nodes of this type (almost 100,000) and I always get an error. If I try it with another type with only 2 or 3 nodes, it works fine.
When I run my module locally (Windows 8.1) I get a "time exceeded" error (it never finishes), and when I run it on my server (Debian 6) I get an error 500. I use Apache both locally and on the server.
How can I do this when I have so many nodes?
Thank you.
If you do a node_load_multiple() of 100,000 nodes, you will get an array of 100,000 node objects plus their custom fields, meaning you will likely trigger millions of MySQL requests, all of it consuming a large amount of RAM.
To delete a huge number of nodes, query your database to extract all the nids, split your array of nids into chunks of 50 or 100, and loop over each chunk (and why not use node_delete_multiple()?).
If this still takes longer than the max_execution_time in your php.ini and you cannot change it, you can use Drupal's Batch API, so that each chunk is handled as a separate HTTP request and the max execution time only affects the deletion of 50-100 nodes.
Edit :
Try this :
$sql = 'SELECT nid FROM {node} n WHERE n.type = :type';
$result = db_query($sql, array(':type' => 'apunte'))->fetchCol();
foreach (array_chunk($result, 100) as $chunk) {
    node_delete_multiple($chunk);
}
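The split array_chunk performs is worth seeing in isolation: the nids are cut into fixed-size batches, with a smaller final batch. A plain sketch of the same idea (nothing Drupal-specific; the chunk size of 100 mirrors the PHP example above):

```python
def chunk(seq, size):
    """Yield consecutive slices of `seq`, each with at most `size` items."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

node_ids = list(range(1, 251))        # stand-in for the fetched nids
batches = list(chunk(node_ids, 100))
print([len(b) for b in batches])      # [100, 100, 50]
```

Each batch would then be passed to node_delete_multiple() (or to one Batch API operation), keeping memory use and execution time bounded per request.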

MySQL insert exceeding memory. Do I use a php script to send data in chunks?

INSERT INTO events (venue_id, artist_id, name, description)
SELECT e.id, e.artist_id, d.a_song, d.a_lyrics
FROM dump_sql AS d
INNER JOIN events AS e
ON d.a_album = e.name
Above is the MySQL query I am using; it works fine. The problem is that I have too much data (150k records), apparently more than the server or MySQL will allow in memory.
I think at a minimum I need a PHP script to insert the data in chunks, and perhaps to increase the memory allowance in PHP, MySQL and ???
Any and all help here would be most appreciated. I am a PHP newb and could use some help coming up with a script or any other pointers.
Thank you!
Error:
Node 0 DMA32 free:2776kB min:2788kB low:3484kB high:4180kB active_anon:211288kB inactive_anon:211276kB active_file:16kB inactive_file:0kB unevictable:0kB isolated(anon):128kB isolated(file):0kB present:500960kB mlocked:0kB dirty:0kB writeback:0kB mapped:116kB shmem:12kB slab_reclaimable:11372kB slab_unreclaimable:32752kB kernel_stack:904kB pagetables:10656kB unstable:0kB bounce:0kB writeback_tmp:0kB pages_scanned:640 all_unreclaimable? yes
lowmem_reserve[]: 0 0 0 0
Node 0 DMA: 12*4kB 22*8kB 0*16kB 0*32kB 0*64kB 0*128kB 1*256kB 1*512kB 1*1024kB 0*2048kB 0*4096kB = 2016kB
Node 0 DMA32: 676*4kB 12*8kB 4*16kB 0*32kB 0*64kB 0*128kB 0*256kB 0*512kB 0*1024kB 0*2048kB 0*4096kB = 2864kB
5001 total pagecache pages
4940 pages in swap cache
Swap cache stats: add 1565880, delete 1560940, find 743932/825587
Free swap = 0kB
Total swap = 1044216kB
131071 pages RAM
5577 pages reserved
2405 pages shared
118768 pages non-shared
Out of memory: kill process 24373 (httpd) score 410236 or a child
Killed process 24373 (httpd) vsz:1640944kB, anon-rss:345220kB, file-rss:28kB
Try changing the default value of
max_allowed_packet
in my.ini.
Change it to something like:
max_allowed_packet = 100M
and see if that helps.
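If raising max_allowed_packet is not enough, the chunking the question asks about can be sketched by running the same INSERT ... SELECT repeatedly with LIMIT/OFFSET, so that each statement moves a bounded number of rows. This is a sketch only: the chunk size and the ORDER BY column (needed so the offset is stable between runs) are assumptions, and each generated statement would be executed by whatever DB layer the PHP script uses.

```python
CHUNK = 10000

def chunked_statements(total_rows, chunk=CHUNK):
    """Yield one INSERT ... SELECT statement per chunk of source rows."""
    for offset in range(0, total_rows, chunk):
        yield (
            "INSERT INTO events (venue_id, artist_id, name, description) "
            "SELECT e.id, e.artist_id, d.a_song, d.a_lyrics "
            "FROM dump_sql AS d INNER JOIN events AS e ON d.a_album = e.name "
            "ORDER BY d.a_song LIMIT %d OFFSET %d" % (chunk, offset)
        )

stmts = list(chunked_statements(150000))
print(len(stmts))   # 15 statements, 10000 source rows each
```

Running the statements one at a time keeps each packet and each transaction small, which sidesteps both max_allowed_packet and the out-of-memory kill shown in the error log.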
