I am trying to understand/implement MinHash-based Jaccard similarity in Python. The main goal is to use it in MapReduce. But I am not clear on how the choice of hash function and the signature length affect the error rate of the computed Jaccard similarity. From Wikipedia, I found that the signature length k needed for an expected error e in the estimated Jaccard similarity is k = O(1/e^2). I tried to implement MinHash in Python:
import random
import sys

#ERROR_THRESHOLD = 0.05
#SIG_LENGTH = int(1/(ERROR_THRESHOLD**2))

_memomask = {}

def hash_values(n, x):
    """Compute n different hash values of x, one per hash function."""
    values = []
    for i in range(n):
        mask = _memomask.get(i)
        if mask is None:
            # derive a reproducible random 32-bit mask for the i-th function
            random.seed(i)
            mask = _memomask[i] = random.getrandbits(32)
        # note: in Python 3, hash() of a str is salted per run
        # unless PYTHONHASHSEED is fixed
        values.append(hash(str(x)) % mask)
    return values

def compare_signatures(x, y):
    """Estimate Jaccard similarity as the fraction of matching slots."""
    size = len(x)
    if size != len(y): raise Exception("Different signature length")
    if size == 0: raise Exception("signature length is zero")
    counter = 0
    for i in range(size): counter += int(x[i] == y[i])
    return counter / float(size)
items = [['A',3], ['A',6], ['A',9], ['B',2], ['B',4], ['B',6], ['B',8]]

for SIG_LENGTH in [1, 10, 100, 400, 1000]:
    # Step 1: compute the hash signature for each token
    data = []
    for item in items:
        values = hash_values(SIG_LENGTH, item[1])
        key = item[0]
        data.append((key, values))

    # Step 2: group by key and keep the minimum hash value at each index
    signatures = {}
    for key, values in data:
        if key not in signatures: signatures[key] = [-1.0]*SIG_LENGTH
        cur_signature = signatures[key]
        signatures[key] = [(values[i] if cur_signature[i] == -1.0 else min(values[i], cur_signature[i]))
                           for i in range(SIG_LENGTH)]

    # Step 3: estimate similarity as the fraction of matching signature slots
    keys = list(signatures.keys())
    key_length = len(keys)
    print("Jaccard Similarity based on signature of length {0}".format(SIG_LENGTH))
    for i in range(key_length):
        x_key = keys[i]
        x_sig = signatures[x_key]
        for j in range(i+1, key_length):
            y_key = keys[j]
            y_sig = signatures[y_key]
            print("J({0},{1}) = {2}".format(x_key, y_key, compare_signatures(x_sig, y_sig)))
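To judge the estimates, it helps to have the exact answer to compare against. The two example sets are small enough to compute the Jaccard similarity directly (a minimal check, using the same values as in `items` above):

```python
# Exact Jaccard similarity |A ∩ B| / |A ∪ B| of the two example sets,
# for comparison with the MinHash estimates.
def jaccard(a, b):
    """Exact Jaccard similarity of two sets."""
    return len(a & b) / float(len(a | b))

A = {3, 6, 9}     # values of the 'A' items
B = {2, 4, 6, 8}  # values of the 'B' items
print("Exact J(A,B) = {0}".format(jaccard(A, B)))  # 1/6 ≈ 0.1667
```

The intersection is {6} and the union has 6 elements, so the MinHash estimates should converge toward 1/6.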
In my tests, I found that the accuracy increases with the signature length, but then it starts to drop (or stays flat). I wonder whether this is because of the choice of hash function. If so, can someone suggest a good hash function to use?
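For what it's worth, one family I have seen used for MinHash is the classic universal hash h(x) = ((a*x + b) mod p) mod m with a prime p and fresh (a, b) per function. This is a hypothetical alternative to the `hash(str(x)) % mask` scheme (names and parameters are my own, not from any linked post); since the masked scheme reduces a single underlying hash with different moduli, the n functions are correlated, which may be part of why accuracy stops improving:

```python
import random

# Universal hash family h(x) = ((a*x + b) mod p) mod m.
# p is a prime larger than any input value; a and b are drawn once per
# hash function, so the functions are pairwise independent.
P = 2147483647  # Mersenne prime 2**31 - 1

def make_hash(seed, m=2**32):
    rnd = random.Random(seed)     # reproducible per-function parameters
    a = rnd.randrange(1, P)       # a must be non-zero
    b = rnd.randrange(0, P)
    def h(x):
        return ((a * x + b) % P) % m
    return h
```

Each signature slot i would then use `make_hash(i)` instead of a fresh mask over the same base hash.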
I found some related posts, but it is still not clear to me: How many hash functions are required in a minhash algorithm
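Regarding the k = O(1/e^2) bound: each signature slot matches independently with probability J, so the k-slot estimate is a binomial proportion with standard error sqrt(J(1-J)/k); inverting sqrt(J(1-J)/k) <= e gives k >= J(1-J)/e^2, hence the 1/e^2 scaling. A quick sketch with the exact J = 1/6 of the example sets (my own illustration of the standard binomial bound, assuming ideal independent hash functions):

```python
import math

# Standard error of a k-slot MinHash estimate of Jaccard similarity J:
# each slot matches with probability J, so the estimator is a binomial
# proportion with standard error sqrt(J * (1 - J) / k).
J = 1.0 / 6.0  # exact Jaccard similarity of {3, 6, 9} and {2, 4, 6, 8}
for k in [1, 10, 100, 400, 1000]:
    se = math.sqrt(J * (1 - J) / k)
    print("k = {0:4d}  expected error ~ {1:.4f}".format(k, se))
```

With ideal hash functions the error should keep shrinking as k grows, so a plateau at large k points at the hash functions rather than the signature length.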