How to remove unhashable duplicates from a list in Python?
My data is this:
[{u'webpath': u'/etc/html', u'server_port': u'80'}, {u'webpath': [u'/www/web'], u'server_port': u'80'}, {u'webpath': [u'/www/web'], u'server_port': u'80'}, {u'webpath': [u'/www/shanghu'], u'server_port': u'80'}, {u'webpath': [u'/www/shanghu'], u'server_port': u'80'}, {u'webpath': [u'/www/www/html/falv'], u'server_port': u'80'}, {u'webpath': [u'/www/www/html/falv'], u'server_port': u'80'}, {u'webpath': [u'/www/www/html/falv'], u'server_port': u'80'}, {u'webpath': [u'/www/falvhezi'], u'server_port': u'80'}, {u'webpath': [u'/www/test10'], u'server_port': u'80'}, {u'webpath': u'/etc/html', u'server_port': u'80'}, {u'webpath': u'/etc/html', u'server_port': u'80'}, {u'webpath': u'/etc/html', u'server_port': u'80'}, {u'webpath': u'/etc/html', u'server_port': u'80'}, {u'webpath': u'/etc/html', u'server_port': u'80'}, {u'webpath': u'/etc/html', u'server_port': u'80'}, {u'webpath': [u'/www/400.ask.com'], u'server_port': u'80'}, {u'webpath': [u'/www/www'], u'server_port': u'80'}, {u'webpath': [u'/www/www'], u'server_port': u'80'}, {u'webpath': [u'/www/www'], u'server_port': u'80'}, {u'webpath': [u'/www/zhuanti'], u'server_port': u'80'}, {u'webpath': [u'/www/zhuanti'], u'server_port': u'80'}, {u'webpath': [u'/www/shanghu'], u'server_port': u'80'}]
My code is this:
seen = set()
new_webpath_list = []
for webpath in nginxconfs:
    t = tuple(webpath.items())
    if t not in seen:
        seen.add(t)
        new_webpath_list.append(webpath)
But the script raises:
TypeError: unhashable type: 'list'
You are creating tuples from the dictionaries to make them hashable, but there can still be non-hashable lists inside those tuples! Instead, you have to "tuplefy" the values as well.
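To see where the error comes from, here is the failure in isolation (a quick interactive demonstration, not from your script):

>>> t = tuple({u'webpath': [u'/www/web'], u'server_port': u'80'}.items())
>>> seen = set()
>>> seen.add(t)    # hashing t must hash the inner list -> fails
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: unhashable type: 'list'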
t = tuple((k, tuple(v)) for (k, v) in webpath.items())
Note, however, that your data is a bit glitchy: in the first entry the webpath is a string, while in the others it is a list of strings. You can mend this with an if/else, although it should not be necessary if the data were consistent.
t = tuple((k, tuple(v) if isinstance(v, list) else v) for (k, v) in webpath.items())
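Put together with the loop from your question, the whole thing looks like this (a minimal sketch; nginxconfs is your list from above):

seen = set()
new_webpath_list = []
for webpath in nginxconfs:
    # turn list values into tuples so the whole key is hashable;
    # leave plain string values alone
    t = tuple((k, tuple(v) if isinstance(v, list) else v)
              for (k, v) in webpath.items())
    if t not in seen:
        seen.add(t)
        new_webpath_list.append(webpath)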
Alternatively, you could just memorize the string representations of the dictionaries...
t = repr(webpath)
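One caveat worth adding here: repr depends on the key order of each dictionary, so two dicts with the same contents but different key order would produce different strings and both be kept. Sorting the items first makes the key independent of ordering:

t = repr(sorted(webpath.items()))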