[LeetCode] LRU Cache

Overview

Design and implement a data structure for Least Recently Used (LRU) cache. It should support the following operations: get and set.

get(key) - Get the value (will always be positive) of the key if the key exists in the cache, otherwise return -1.
set(key, value) - Set or insert the value if the key is not already present. When the cache reaches its capacity, it should invalidate the least recently used item before inserting the new item.
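
For example, with a cache of capacity 2: after set(1, 1), set(2, 2) and get(1) (which returns 1), the call set(3, 3) must evict key 2, since it is now the least recently used entry, so a later get(2) returns -1.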

Analysis:

First, for the cache itself, O(1) lookup means we must use a hashmap to store the mapping from keys to their entries. The real difficulty for LRU is how to also handle cache-entry replacement in O(1).

Simply put, if the cache entries are stored in a linked list, we only need to maintain one invariant: from head to tail, the list is ordered from most recently used to least recently used. Whenever a node is accessed, move it to the head; when the cache is full, delete the tail node and insert the new node at the head.

So two data structures are needed:

1. A hashmap, mapping each key to the position of its entry in the list.

2. A list, keeping the entries ordered from newest to oldest. It does not have to be a linked list; a vector could also be used, but a list already provides the head/tail operations we need, whereas with a vector you would have to write them yourself, which is more trouble.

With a singly linked list, moving an accessed node to the head requires first finding its predecessor, so the operations are no longer O(1) (they degrade to O(n) per access).

With a doubly linked list, every move and eviction is O(1).
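
As a side note (not part of the original solution): Java's standard library already packages exactly this pairing. A LinkedHashMap constructed with accessOrder = true keeps its entries ordered from least to most recently accessed, and overriding removeEldestEntry supplies the eviction hook. A minimal sketch, assuming the same get/set interface as the problem; the class name LinkedHashMapLRUCache is just illustrative:

import java.util.LinkedHashMap;
import java.util.Map;

public class LinkedHashMapLRUCache {
    private final LinkedHashMap<Integer, Integer> map;

    public LinkedHashMapLRUCache(final int capacity) {
        // accessOrder = true: iteration order is least-recently-accessed first
        map = new LinkedHashMap<Integer, Integer>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
                return size() > capacity; // evict the LRU entry once over capacity
            }
        };
    }

    public int get(int key) {
        Integer v = map.get(key); // get() refreshes the access order
        return v == null ? -1 : v;
    }

    public void set(int key, int value) {
        map.put(key, value); // put() inserts/updates and triggers removeEldestEntry
    }
}

The two hand-rolled implementations below show what this combination does internally.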

Java (doubly linked list + HashMap)

import java.util.HashMap;

public class LRUCache {
    class CacheEntry {
        int key;
        int value;
        CacheEntry pre;
        CacheEntry next;

        public CacheEntry(int k, int v) {
            this.key = k;
            this.value = v;
        }
    }

    int l_capacity;
    HashMap<Integer, CacheEntry> l_HashMapmap;
    CacheEntry head, tail; // dummy sentinels: head.next is the most recent entry, tail.pre the least recent

    // use a doubly linked list so that unlinking/relinking a node is O(1)
    public LRUCache(int capacity) {
        l_capacity = capacity;
        l_HashMapmap = new HashMap<>(capacity);
        head = new CacheEntry(-1, -1);
        tail = new CacheEntry(-1, -1);
        head.next = tail;
        tail.pre = head;
    }

    public int get(int key) { // if the key exists, return its value and mark it most recently used
        if (l_HashMapmap.containsKey(key)) {
            CacheEntry cEntry = l_HashMapmap.get(key);
            MoveToHead(cEntry);
            return cEntry.value;
        }
        else return -1;
    }

    public void set(int key, int value) {
        if (l_HashMapmap.containsKey(key)) { // key already cached: update the value and refresh its position
            CacheEntry cEntry = l_HashMapmap.get(key);
            cEntry.value = value;
            MoveToHead(cEntry);
        } else if (l_HashMapmap.size() < l_capacity) { // new key and spare capacity: just insert at the head
            CacheEntry cEntry = new CacheEntry(key, value);
            MoveToHead(cEntry);
            l_HashMapmap.put(key, cEntry);
        } else { // new key and the cache is full: insert, then evict the least recently used entry
            CacheEntry cEntry = new CacheEntry(key, value);
            MoveToHead(cEntry);
            l_HashMapmap.put(key, cEntry);
            int endKey = removeEnd();
            l_HashMapmap.remove(endKey);
        }
    }

    // unlink the node just before the tail sentinel and return its key
    private int removeEnd() {
        CacheEntry cEntry = tail.pre;
        tail.pre.pre.next = tail;
        tail.pre = cEntry.pre;
        cEntry.pre = null;
        cEntry.next = null;
        return cEntry.key;
    }

    // unlink the node from its current position (if already linked) and insert it right after head
    private void MoveToHead(CacheEntry cEntry) {
        if (cEntry.next != null && cEntry.pre != null) {
            cEntry.pre.next = cEntry.next;
            cEntry.next.pre = cEntry.pre;
        }
        cEntry.pre = head;
        cEntry.next = head.next;
        head.next.pre = cEntry;
        head.next = cEntry;
    }
}
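
A quick usage sketch for the class above (the LRUCacheDemo wrapper and the capacity-1 scenario are just for illustration, not part of the LeetCode submission):

public class LRUCacheDemo {
    public static void main(String[] args) {
        LRUCache cache = new LRUCache(1);
        cache.set(1, 1);
        cache.set(1, 100);                // existing key: value updated in place, nothing evicted
        System.out.println(cache.get(1)); // 100
        cache.set(2, 2);                  // capacity is 1, so inserting key 2 evicts key 1
        System.out.println(cache.get(1)); // -1
        System.out.println(cache.get(2)); // 2
    }
}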

C++ (std::list, which is a doubly linked list, plus unordered_map)

#include <list>
#include <unordered_map>
using namespace std;

class LRUCache {
public:
    struct CacheEntry {
        int key;
        int value;
        CacheEntry(int k, int v): key(k), value(v) {}
    };

    LRUCache(int capacity) {
        m_capacity = capacity;
    }

    int get(int key) {
        if (m_map.find(key) == m_map.end())
            return -1;
        MoveToHead(key);             // mark as most recently used
        return m_map[key]->value;
    }

    void set(int key, int value) {
        if (m_map.find(key) == m_map.end()) {
            CacheEntry newItem(key, value);
            if (m_LRU_cache.size() >= m_capacity) { // cache full: evict from the tail
                m_map.erase(m_LRU_cache.back().key);
                m_LRU_cache.pop_back();
            }
            // insert the new entry at the head and remember its iterator
            m_LRU_cache.push_front(newItem);
            m_map[key] = m_LRU_cache.begin();
            return;
        }
        // key already cached: update the value and refresh its position
        m_map[key]->value = value;
        MoveToHead(key);
    }

private:
    int m_capacity;
    list<CacheEntry> m_LRU_cache;                         // front = most recent, back = least recent
    unordered_map<int, list<CacheEntry>::iterator> m_map; // key -> position in the list

    // re-insert the entry at the front of the list; std::list::splice could do this
    // without copying the entry, but copy-erase-push_front keeps the code simple
    void MoveToHead(int key) {
        auto updateEntry = *m_map[key];
        m_LRU_cache.erase(m_map[key]);
        m_LRU_cache.push_front(updateEntry);
        m_map[key] = m_LRU_cache.begin();
    }
};
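
One detail worth noting about this combination: std::list iterators stay valid when other elements are inserted or erased, so the iterators stored in m_map never dangle as the list is reshuffled (only the erased element's own iterator is invalidated, and MoveToHead refreshes it in the map right away). A vector or deque would not give that guarantee, which is another reason list is the natural choice here.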



