PyLRU
=====

A least recently used (LRU) cache for Python.

Pylru implements a true LRU cache along with several support classes. The cache is efficient and written in pure Python. It works with Python 2.6 and later, including the 3.x series. Basic operations (lookup, insert, delete) all run in constant time.


Usage
=====

You can install pylru, or you can just copy the source file pylru.py and use it in your own project.

An LRU cache object has a dictionary-like interface and can be used in the same way::

    import pylru
    
    size = 100
    cache = pylru.lrucache(size)

    cache[key] = value  # Add a key/value pair
    key in cache        # Test for membership
    value = cache[key]  # Lookup a value given its key
    del cache[key]      # Remove a value given its key

    cache.size()        # Returns the size of the cache
    cache.size(x)       # Changes the size of the cache. x MUST be greater than
                        # zero.
                        
    x = len(cache)      # Returns the number of elements stored in the cache.
                        # x will be less than or equal to cache.size()                        
                        
    cache.clear()       # Remove all elements from the cache.
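
For instance, here is a quick sketch of the LRU eviction order (the keys used
are just illustrative)::

    import pylru

    cache = pylru.lrucache(2)   # a tiny cache that holds at most 2 entries

    cache['a'] = 1
    cache['b'] = 2
    value = cache['a']          # touch 'a', so 'b' is now least recently used
    cache['c'] = 3              # the cache is full, so 'b' is ejected

    print('b' in cache)         # False
    print('a' in cache)         # True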
    
The lrucache constructor takes an optional callback function as a second argument. Since the cache has a fixed size, some operations, such as an insertion, may cause a key/value pair to be ejected. If the optional callback function is given, it will be called when this occurs. For example::

    import pylru
    
    def callback(key, value):
        print (key, value)    # A dumb callback that just prints the key/value
    
    size = 100
    cache = pylru.lrucache(size, callback)
    
    # Use the cache... When it gets full some pairs may be ejected due to
    # the fixed cache size. But, not before the callback is called to let you
    # know.
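
A minimal sketch of the callback in action, using a deliberately tiny cache
(the keys are just illustrative)::

    import pylru

    def callback(key, value):
        print(key, value)       # report the ejected key/value pair

    cache = pylru.lrucache(1, callback)

    cache['a'] = 1
    cache['b'] = 2              # only one entry fits, so 'a' is ejected and
                                # callback('a', 1) is called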
    
Often a cache is used to speed up access to some other high latency object. If that object has a dictionary interface, a convenience wrapper class provided by PyLRU can be used. This class takes as arguments the object you want to wrap and the cache size. It then creates an LRU cache for the object and automatically manages it. For example, imagine you have an object with a dictionary interface that reads/writes its values to and from a remote server. Let us call this object slowDict::

    import pylru
    
    size = 100
    cacheDict = pylru.lruwrap(slowDict, size)
    
    # Now cacheDict can be used just like slowDict, except all of the lookups
    # are automatically cached for you using an LRU cache.
    
By default lruwrap uses write-through semantics. For instance, in the above example insertions are updated in the cache and written through to slowDict immediately. The cache and the underlying object are never allowed to get out of sync, so only lookup performance can be improved by the cache. lruwrap takes an optional third argument. If it is set to True, write-back semantics will be used. Insertions are updated only in the cache; the underlying slowDict is automatically updated only when a "dirty" key/value pair is ejected from the cache.

The programmer is responsible for one thing, though: they MUST call sync() when they are finished. This ensures that the last of the "dirty" entries in the cache are written back::


    import pylru
    
    size = 100
    cacheDict = pylru.lruwrap(slowDict, size, True)
    
    # Now cacheDict can be used just like slowDict, except all of the lookups
    # are automatically cached for you using an LRU cache with Write-Back
    # semantics.

    # DON'T forget to call sync() when finished
    cacheDict.sync()
    
To help the programmer with this, lruwrap can be used in a with statement::
    
    with pylru.lruwrap(slowDict, size, True) as cacheDict:

        # Use cacheDict. sync() is called automatically for you when leaving
        # the with statement block.
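
For example, a small sketch using an ordinary dict as a stand-in for the slow
object::

    import pylru

    slowDict = {}               # stand-in for the real slow, dict-like object

    with pylru.lruwrap(slowDict, 100, True) as cacheDict:
        cacheDict['key'] = 'value'

    # sync() was called automatically on exit, so the dirty entry has been
    # written back to the underlying dictionary.
    print(slowDict['key'])      # 'value'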


PyLRU also provides a function decorator::

    from pylru import lrudecorator

    @lrudecorator(100)
    def square(x):
        return x*x
        
    # Now results of the square function are cached for future lookup.
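
    # For instance (illustrative), repeated calls with the same argument are
    # served from the cache rather than recomputed:
    print(square(5))    # 25, computed and cached
    print(square(5))    # 25, returned from the cache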