Design and implement a data structure for a Least Recently Used (LRU) cache. It should support the following operations: get and put.
get(key)
- Get the value (will always be positive) of the key if the key exists in the cache, otherwise return -1.
put(key, value)
- Set or insert the value if the key is not already present. When the cache reaches its capacity, it should invalidate the least recently used item before inserting a new item.

Follow up:
Could you do both operations in O(1) time complexity?

Example:

LRUCache cache = new LRUCache( 2 /* capacity */ );

cache.put(1, 1);
cache.put(2, 2);
cache.get(1);       // returns 1
cache.put(3, 3);    // evicts key 2
cache.get(2);       // returns -1 (not found)
cache.put(4, 4);    // evicts key 1
cache.get(1);       // returns -1 (not found)
cache.get(3);       // returns 3
cache.get(4);       // returns 4
The idea is to use a linked list to record the items currently in the cache, since delete, move, and add are all O(1) operations on a list, and to use a hash map to look up what is currently in the cache.

Note: because eviction works by key, each list node stores its key in addition to its value. Also, since this is a singly linked list, deleting a node requires knowing its predecessor, so the hash map actually stores each key's predecessor node rather than the node itself, which is why a dummy head node is needed. Finally, since the list operations reduce to removing the head and appending to the tail, it helps to write those as two separate helper functions: easier to debug, and it keeps the code organized.
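To make the predecessor trick concrete, here is a minimal standalone sketch (the names Node and prev are illustrative only, not part of the solution below): because the hash maps a key to the node before it, a removal never has to walk the list.

class Node:
    def __init__(self, key, value):
        self.key, self.val, self.next = key, value, None

# Build dummy -> (1) -> (2) -> (3); prev maps each key to the node *before* it.
dummy = Node(0, 0)
prev = {}
tail = dummy
for k in (1, 2, 3):
    node = Node(k, k * 10)
    tail.next = node
    prev[k] = tail
    tail = node

# Remove key 2 in O(1): follow the stored predecessor instead of traversing.
pre = prev.pop(2)
removed = pre.next
pre.next = removed.next
if removed.next:                     # the successor's predecessor has changed
    prev[removed.next.key] = pre

cur = dummy.next
while cur:
    print(cur.key, cur.val)          # prints 1 10, then 3 30
    cur = cur.next

The popHead and moveToTail helpers in the solution below perform this same bookkeeping.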
Code:
class ListNode:
    def __init__(self, key, value):
        self.key = key          # key is stored too, so the hash entry can be removed on eviction
        self.val = value
        self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.d = dict()                 # hash: key -> predecessor of the node holding that key
        self.dummy = ListNode(0, 0)     # dummy head, so every real node has a predecessor
        self.tail = self.dummy          # tail = most recently used node

    def popHead(self):
        # Evict the least recently used node (the one right after dummy).
        head = self.dummy.next
        self.dummy.next = head.next
        self.d.pop(head.key)
        if head.next:
            self.d[head.next.key] = self.dummy

    def addToTail(self, node):
        self.tail.next = node
        self.d[node.key] = self.tail
        self.tail = node

    def moveToTail(self, pre):
        # pre is the predecessor of the node being touched; mark that node most recently used.
        if pre.next == self.tail:
            return
        node = pre.next
        pre.next = node.next
        if node.next:
            self.d[node.next.key] = pre
        node.next = None
        self.addToTail(node)

    def get(self, key):
        if key in self.d:
            self.moveToTail(self.d[key])
            return self.d[key].next.val
        return -1

    def put(self, key, value):
        if key in self.d:
            self.moveToTail(self.d[key])
            self.d[key].next.val = value
        else:
            self.addToTail(ListNode(key, value))
            if len(self.d) > self.capacity:
                self.popHead()
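A quick sanity check, replaying the example from the problem statement (assuming the class above is defined in the same file):

cache = LRUCache(2)
cache.put(1, 1)
cache.put(2, 2)
print(cache.get(1))   # 1
cache.put(3, 3)       # evicts key 2
print(cache.get(2))   # -1 (not found)
cache.put(4, 4)       # evicts key 1
print(cache.get(1))   # -1 (not found)
print(cache.get(3))   # 3
print(cache.get(4))   # 4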