-rw-r--r--  Makefile                   |  10
-rw-r--r--  docs/index.md              |   2
-rw-r--r--  src/tests/documentation.py | 739
3 files changed, 387 insertions, 364 deletions
diff --git a/Makefile b/Makefile
index 6d2cee1..4594571 100644
--- a/Makefile
+++ b/Makefile
@@ -1,11 +1,5 @@
-RST=$(S)/ms/tools/rst.py -H
-
-rst: src/tests/documentation.py
- PYTHONPATH=src:$(S) $(S)/ms/tools/minidoc.py -d tests.documentation
- cp /tmp/tests.documentation.rst docs
-
-html: /tmp/tests.documentation.rst
- sphinx-build docs docs/_build
+md: src/tests/documentation.py
+ python $(S)/ms/tools/py2md.py src/tests/documentation.py docs
upload: README.rst
python setup.py sdist bdist_wheel upload
diff --git a/docs/index.md b/docs/index.md
index 5efdee7..1849da5 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -1 +1 @@
-[Here](tests.documentation.rst)
+- [master](documentation.md)
diff --git a/src/tests/documentation.py b/src/tests/documentation.py
index 13d1415..2e6b473 100644
--- a/src/tests/documentation.py
+++ b/src/tests/documentation.py
@@ -12,18 +12,17 @@ except ImportError:
from decorator import (decorator, decorate, FunctionMaker, contextmanager,
dispatch_on, __version__)
-doc = r"""\
-The ``decorator`` module
-
-:Author: Michele Simionato
-:E-mail: michele.simionato@gmail.com
-:Version: $VERSION ($DATE)
-:Supports: Python 2.6, 2.7, 3.0, 3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7
-:Download page: http://pypi.python.org/pypi/decorator/$VERSION
-:Installation: ``pip install decorator``
-:License: BSD license
-
-.. contents::
+doc = r"""The ``decorator`` module
+----------------------------------
+
+|Author | Michele Simionato|
+|---|---|
+|E-mail | michele.simionato@gmail.com|
+|Version| $VERSION ($DATE)|
+|Supports| Python 2.6, 2.7, 3.0, 3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7|
+|Download page| http://pypi.python.org/pypi/decorator/$VERSION|
+|Installation| ``pip install decorator``|
+|License | BSD license|
Introduction
-----------------------------------------
@@ -40,14 +39,13 @@ decision made it possible to use a single code base both for Python
2,000 lines of duplicated documentation/doctests. Having to maintain
separate docs for Python 2 and Python 3 effectively stopped any
development on the module for several years. Moreover, it is now
-trivial to distribute the module as an universal wheel_ since 2to3 is no more
+trivial to distribute the module as a universal
+ [wheel](http://pythonwheels.com) since 2to3 is no longer
required. Since Python 2.5 was released ages ago (in 2006), I felt it
was reasonable to drop support for it. If you need to support
ancient versions of Python, stick with the decorator module version
3.4.2. The current version supports all Python releases from 2.6 up to 3.6.
-.. _wheel: http://pythonwheels.com/
-
What's New in version 4
-----------------------
@@ -180,8 +178,6 @@ Here is an example of usage:
$$f1
-.. _functools.update_wrapper: https://docs.python.org/3/library/functools.html#functools.update_wrapper
-
This works insofar as the decorator accepts functions with generic signatures.
Unfortunately, it is *not* a signature-preserving decorator, since
``memoize_uw`` generally returns a function with a *different signature*
@@ -195,11 +191,12 @@ Here, the original function takes a single argument named ``x``,
but the decorated function takes any number of arguments and
keyword arguments:
-.. code-block:: python
+```python
+>>> from decorator import getfullargspec
+>>> print(getfullargspec(f1))
+FullArgSpec(args=[], varargs='args', varkw='kw', defaults=None, kwonlyargs=[], kwonlydefaults=None, annotations={})
- >>> from decorator import getfullargspec
- >>> print(getfullargspec(f1))
- FullArgSpec(args=[], varargs='args', varkw='kw', defaults=None, kwonlyargs=[], kwonlydefaults=None, annotations={})
+```
This means that introspection tools (like ``pydoc``) will give false
information about the signature of ``f1`` -- unless you are using
@@ -207,12 +204,13 @@ Python 3.5. This is pretty bad: ``pydoc`` will tell you that the
function accepts the generic signature ``*args, **kw``, but
calling the function with more than one argument raises an error:
-.. code-block:: python
+```
+>>> f1(0, 1) # doctest: +IGNORE_EXCEPTION_DETAIL
+Traceback (most recent call last):
+ ...
+TypeError: f1() takes exactly 1 positional argument (2 given)
- >>> f1(0, 1) # doctest: +IGNORE_EXCEPTION_DETAIL
- Traceback (most recent call last):
- ...
- TypeError: f1() takes exactly 1 positional argument (2 given)
+```
Notice that ``inspect.getfullargspec``
will give the wrong signature, even in the latest Python, i.e. version 3.6
@@ -226,9 +224,10 @@ hides the complexity of making signature-preserving decorators
from the application programmer. The ``decorate`` function in
the ``decorator`` module is such a factory:
-.. code-block:: python
+```python
+>>> from decorator import decorate
- >>> from decorator import decorate
+```
``decorate`` takes two arguments:
@@ -253,25 +252,27 @@ function you want to decorate; there are no closures.
Here is a test of usage:
-.. code-block:: python
+```python
+>>> @memoize
+... def heavy_computation():
+... time.sleep(2)
+... return "done"
- >>> @memoize
- ... def heavy_computation():
- ... time.sleep(2)
- ... return "done"
+>>> print(heavy_computation()) # the first time it will take 2 seconds
+done
- >>> print(heavy_computation()) # the first time it will take 2 seconds
- done
+>>> print(heavy_computation()) # the second time it will be instantaneous
+done
- >>> print(heavy_computation()) # the second time it will be instantaneous
- done
+```
The signature of ``heavy_computation`` is the one you would expect:
-.. code-block:: python
+```python
+>>> print(getfullargspec(heavy_computation))
+FullArgSpec(args=[], varargs=None, varkw=None, defaults=None, kwonlyargs=[], kwonlydefaults=None, annotations={})
- >>> print(getfullargspec(heavy_computation))
- FullArgSpec(args=[], varargs=None, varkw=None, defaults=None, kwonlyargs=[], kwonlydefaults=None, annotations={})
+```
A ``trace`` decorator
------------------------------------------------------
@@ -285,40 +286,43 @@ $$trace
Here is an example of usage:
-.. code-block:: python
+```python
+>>> @trace
+... def f1(x):
+... pass
- >>> @trace
- ... def f1(x):
- ... pass
+```
It is easy to verify that ``f1`` works...
-.. code-block:: python
+```python
+>>> f1(0)
+calling f1 with args (0,), {}
- >>> f1(0)
- calling f1 with args (0,), {}
+```
...and that it has the correct signature:
-.. code-block:: python
+```python
+>>> print(getfullargspec(f1))
+FullArgSpec(args=['x'], varargs=None, varkw=None, defaults=None, kwonlyargs=[], kwonlydefaults=None, annotations={})
- >>> print(getfullargspec(f1))
- FullArgSpec(args=['x'], varargs=None, varkw=None, defaults=None, kwonlyargs=[], kwonlydefaults=None, annotations={})
+```
The decorator works with functions of any signature:
-.. code-block:: python
-
- >>> @trace
- ... def f(x, y=1, z=2, *args, **kw):
- ... pass
+```python
+>>> @trace
+... def f(x, y=1, z=2, *args, **kw):
+... pass
- >>> f(0, 3)
- calling f with args (0, 3, 2), {}
+>>> f(0, 3)
+calling f with args (0, 3, 2), {}
- >>> print(getfullargspec(f))
- FullArgSpec(args=['x', 'y', 'z'], varargs='args', varkw='kw', defaults=(1, 2), kwonlyargs=[], kwonlydefaults=None, annotations={})
+>>> print(getfullargspec(f))
+FullArgSpec(args=['x', 'y', 'z'], varargs='args', varkw='kw', defaults=(1, 2), kwonlyargs=[], kwonlydefaults=None, annotations={})
+```
$FUNCTION_ANNOTATIONS
``decorator.decorator``
@@ -332,12 +336,12 @@ to convert the caller function into a signature-preserving decorator.
It is the ``decorator`` function:
-.. code-block:: python
-
- >>> from decorator import decorator
- >>> print(decorator.__doc__)
- decorator(caller) converts a caller function into a decorator
+```python
+>>> from decorator import decorator
+>>> print(decorator.__doc__)
+decorator(caller) converts a caller function into a decorator
+```
The ``decorator`` function can be used as a signature-changing
decorator, just like ``classmethod`` and ``staticmethod``.
But ``classmethod`` and ``staticmethod`` return generic
@@ -346,52 +350,55 @@ signature-preserving decorators (i.e. functions with a single argument).
For instance, you can write:
-.. code-block:: python
+```python
+>>> @decorator
+... def trace(f, *args, **kw):
+... kwstr = ', '.join('%r: %r' % (k, kw[k]) for k in sorted(kw))
+... print("calling %s with args %s, {%s}" % (f.__name__, args, kwstr))
+... return f(*args, **kw)
- >>> @decorator
- ... def trace(f, *args, **kw):
- ... kwstr = ', '.join('%r: %r' % (k, kw[k]) for k in sorted(kw))
- ... print("calling %s with args %s, {%s}" % (f.__name__, args, kwstr))
- ... return f(*args, **kw)
+```
And ``trace`` is now a decorator!
-.. code-block:: python
+```python
+>>> trace # doctest: +ELLIPSIS
+<function trace at 0x...>
- >>> trace # doctest: +ELLIPSIS
- <function trace at 0x...>
+```
Here is an example of usage:
-.. code-block:: python
+```python
+>>> @trace
+... def func(): pass
- >>> @trace
- ... def func(): pass
+>>> func()
+calling func with args (), {}
- >>> func()
- calling func with args (), {}
+```
The `decorator` function can also be used to define factories of decorators,
i.e. functions returning decorators. In general you can just write something
like this:
-.. code-block:: python
-
- def decfactory(param1, param2, ...):
- def caller(f, *args, **kw):
- return somefunc(f, param1, param2, .., *args, **kw)
- return decorator(caller)
+```python
+def decfactory(param1, param2, ...):
+ def caller(f, *args, **kw):
+ return somefunc(f, param1, param2, .., *args, **kw)
+ return decorator(caller)
+```
This is fully general but requires an additional level of nesting. For this
reason, since version 4.2 there is a facility to build
decorator factories by using a single caller with default arguments, i.e.
writing something like this:
-.. code-block:: python
-
- def caller(f, param1=default1, param2=default2, ..., *args, **kw):
- return somefunc(f, param1, param2, *args, **kw)
- decfactory = decorator(caller)
+```python
+def caller(f, param1=default1, param2=default2, ..., *args, **kw):
+ return somefunc(f, param1, param2, *args, **kw)
+decfactory = decorator(caller)
+```
Notice that this simplified approach *only works with default arguments*,
i.e. `param1`, `param2` etc must have known defaults. Thanks to this
@@ -400,10 +407,10 @@ of the family which uses the default values for all parameters. Such
decorator can be written as ``decfactory()`` with no parameters specified;
moreover, as a shortcut, it is also possible to elide the parentheses,
a feature much requested by the users. For years I was opposed
-to this feature request, since having expliciti parenthesis to me is more clear
+to this feature request, since having explicit parentheses is clearer to me
and less magic; however, once this feature entered decorators of
-the Python standard library (I am referring to the `dataclass` decorator
-https://www.python.org/dev/peps/pep-0557/) I finally gave up.
+the Python standard library (I am referring to the [dataclass decorator](
+https://www.python.org/dev/peps/pep-0557/)) I finally gave up.
The example below will show how it works in practice.
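Here is a stdlib-only sketch of the elision mechanics (the names are hypothetical, and this is *not* the decorator module's implementation, which hides this bookkeeping for you):

```python
import functools

def chatty(func=None, prefix="LOG"):
    # hypothetical factory: usable both as @chatty and as @chatty(prefix=...)
    if func is None:  # called with parentheses: return a configured decorator
        return functools.partial(chatty, prefix=prefix)
    @functools.wraps(func)
    def wrapper(*args, **kw):
        return "%s: %s" % (prefix, func(*args, **kw))
    return wrapper

@chatty            # no parentheses: defaults are used
def greet():
    return "hello"

@chatty(prefix="DEBUG")
def greet2():
    return "hello"

assert greet() == "LOG: hello"
assert greet2() == "DEBUG: hello"
```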
@@ -422,27 +429,28 @@ Functions decorated with ``blocking`` will return a busy message if
the resource is unavailable, and the intended result if the resource is
available. For instance:
-.. code-block:: python
+```python
+>>> @blocking("Please wait ...")
+... def read_data():
+... time.sleep(3) # simulate a blocking resource
+... return "some data"
- >>> @blocking("Please wait ...")
- ... def read_data():
- ... time.sleep(3) # simulate a blocking resource
- ... return "some data"
+>>> print(read_data()) # data is not available yet
+Please wait ...
- >>> print(read_data()) # data is not available yet
- Please wait ...
+>>> time.sleep(1)
+>>> print(read_data()) # data is not available yet
+Please wait ...
- >>> time.sleep(1)
- >>> print(read_data()) # data is not available yet
- Please wait ...
+>>> time.sleep(1)
+>>> print(read_data()) # data is not available yet
+Please wait ...
- >>> time.sleep(1)
- >>> print(read_data()) # data is not available yet
- Please wait ...
+>>> time.sleep(1.1) # after 3.1 seconds, data is available
+>>> print(read_data())
+some data
- >>> time.sleep(1.1) # after 3.1 seconds, data is available
- >>> print(read_data())
- some data
+```
Decorator factories are most useful to framework builders. Here is an example
that gives an idea of how you could manage permissions in a framework:
@@ -477,17 +485,18 @@ the final result.
Here is the minimalistic usage:
-.. code-block:: python
+```python
+>>> @decorator(Future)
+... def long_running(x):
+... time.sleep(.5)
+... return x
- >>> @decorator(Future)
- ... def long_running(x):
- ... time.sleep(.5)
- ... return x
+>>> fut1 = long_running(1)
+>>> fut2 = long_running(2)
+>>> fut1.result() + fut2.result()
+3
- >>> fut1 = long_running(1)
- >>> fut2 = long_running(2)
- >>> fut1.result() + fut2.result()
- 3
+```
contextmanager
-------------------------------------
@@ -496,27 +505,28 @@ Python's standard library has the ``contextmanager`` decorator,
which converts a generator function into a ``GeneratorContextManager``
factory. For instance, if you write this...
-.. code-block:: python
-
- >>> from contextlib import contextmanager
- >>> @contextmanager
- ... def before_after(before, after):
- ... print(before)
- ... yield
- ... print(after)
+```python
+>>> from contextlib import contextmanager
+>>> @contextmanager
+... def before_after(before, after):
+... print(before)
+... yield
+... print(after)
+```
...then ``before_after`` is a factory function that returns
``GeneratorContextManager`` objects, which provide support for
the ``with`` statement:
-.. code-block:: python
+```python
+>>> with before_after('BEFORE', 'AFTER'):
+... print('hello')
+BEFORE
+hello
+AFTER
- >>> with before_after('BEFORE', 'AFTER'):
- ... print('hello')
- BEFORE
- hello
- AFTER
+```
Basically, it is as if the content of the ``with`` block was executed
in the place of the ``yield`` expression in the generator function.
@@ -524,16 +534,17 @@ in the place of the ``yield`` expression in the generator function.
In Python 3.2, ``GeneratorContextManager`` objects were enhanced with
a ``__call__`` method, so that they can be used as decorators, like so:
-.. code-block:: python
+```python
+>>> @ba # doctest: +SKIP
+... def hello():
+... print('hello')
+...
+>>> hello() # doctest: +SKIP
+BEFORE
+hello
+AFTER
- >>> @ba # doctest: +SKIP
- ... def hello():
- ... print('hello')
- ...
- >>> hello() # doctest: +SKIP
- BEFORE
- hello
- AFTER
+```
The ``ba`` decorator basically inserts a ``with ba:`` block
inside the function.
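On Python >= 3.2 this can be checked with the standard library alone; a small sketch appending to a list instead of printing, so the effect is easy to assert:

```python
from contextlib import contextmanager

log = []

@contextmanager
def before_after(before, after):
    log.append(before)
    yield
    log.append(after)

ba = before_after('BEFORE', 'AFTER')  # usable as a decorator since Python 3.2

@ba
def hello():
    log.append('hello')

hello()
# the decorated call runs inside an implicit `with ba:` block
assert log == ['BEFORE', 'hello', 'AFTER']
```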
@@ -580,14 +591,15 @@ where the function is generated by ``exec``.
Here's an example:
-.. code-block:: python
+```python
+>>> def f(*args, **kw): # a function with a generic signature
+... print(args, kw)
- >>> def f(*args, **kw): # a function with a generic signature
- ... print(args, kw)
+>>> f1 = FunctionMaker.create('f1(a, b)', 'f(a, b)', dict(f=f))
+>>> f1(1,2)
+(1, 2) {}
- >>> f1 = FunctionMaker.create('f1(a, b)', 'f(a, b)', dict(f=f))
- >>> f1(1,2)
- (1, 2) {}
+```
It is important to notice that the function body is interpolated
before being executed; **be careful** with the ``%`` sign!
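The reason is that the body template goes through ``%``-interpolation, so literal percent signs in the generated code must be doubled. A plain-Python illustration of the pitfall, independent of ``FunctionMaker`` itself:

```python
# a body template as it would be %-interpolated: '%(name)s' is substituted,
# so a literal '%' wanted in the generated source must be written as '%%'
template = "def %(name)s(a, b):\n    return '%%d/%%d' %% (a, b)\n"
src = template % dict(name='ratio')
assert src == "def ratio(a, b):\n    return '%d/%d' % (a, b)\n"
```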
@@ -602,14 +614,15 @@ the source code of the generated function. To do this, just
pass ``addsource=True``, and the generated function will get
a ``__source__`` attribute:
-.. code-block:: python
+```python
+>>> f1 = FunctionMaker.create(
+... 'f1(a, b)', 'f(a, b)', dict(f=f), addsource=True)
+>>> print(f1.__source__)
+def f1(a, b):
+ f(a, b)
+<BLANKLINE>
- >>> f1 = FunctionMaker.create(
- ... 'f1(a, b)', 'f(a, b)', dict(f=f), addsource=True)
- >>> print(f1.__source__)
- def f1(a, b):
- f(a, b)
- <BLANKLINE>
+```
The first argument to ``FunctionMaker.create`` can be a string (as above),
or a function. This is the most common usage, since you typically decorate
@@ -641,13 +654,13 @@ Here is what happens:
(e.g., something like ``'f1(a, b=None)'``). Just pass ``'f1(a, b)'``,
followed by a tuple of defaults:
-.. code-block:: python
-
- >>> f1 = FunctionMaker.create(
- ... 'f1(a, b)', 'f(a, b)', dict(f=f), addsource=True, defaults=(None,))
- >>> print(getfullargspec(f1))
- FullArgSpec(args=['a', 'b'], varargs=None, varkw=None, defaults=(None,), kwonlyargs=[], kwonlydefaults=None, annotations={})
+```python
+>>> f1 = FunctionMaker.create(
+... 'f1(a, b)', 'f(a, b)', dict(f=f), addsource=True, defaults=(None,))
+>>> print(getfullargspec(f1))
+FullArgSpec(args=['a', 'b'], varargs=None, varkw=None, defaults=(None,), kwonlyargs=[], kwonlydefaults=None, annotations={})
+```
Getting the source code
---------------------------------------------------
@@ -666,33 +679,34 @@ not what you want:
$$identity_dec
$$example
-.. code-block:: python
+```python
+>>> import inspect
+>>> print(inspect.getsource(example))
+ def wrapper(*args, **kw):
+ return func(*args, **kw)
+<BLANKLINE>
- >>> import inspect
- >>> print(inspect.getsource(example))
- def wrapper(*args, **kw):
- return func(*args, **kw)
- <BLANKLINE>
+```
-(See bug report 1764286_ for an explanation of what is happening).
+(See bug report [1764286](http://bugs.python.org/issue1764286)
+for an explanation of what is happening).
Unfortunately the bug still exists in all versions of Python < 3.5.
However, there is a workaround. The decorated function has the ``__wrapped__``
attribute, pointing to the original function. The simplest way to get the
source code is to call ``inspect.getsource`` on the undecorated function:
-.. code-block:: python
-
- >>> print(inspect.getsource(factorial.__wrapped__))
- @tail_recursive
- def factorial(n, acc=1):
- "The good old factorial"
- if n == 0:
- return acc
- return factorial(n-1, n*acc)
- <BLANKLINE>
+```python
+>>> print(inspect.getsource(factorial.__wrapped__))
+@tail_recursive
+def factorial(n, acc=1):
+ "The good old factorial"
+ if n == 0:
+ return acc
+ return factorial(n-1, n*acc)
+<BLANKLINE>
-.. _1764286: http://bugs.python.org/issue1764286
+```
Dealing with third-party decorators
-----------------------------------------------------------------
@@ -736,10 +750,11 @@ Here is how you apply the upgraded decorator to the good old factorial:
$$factorial
-.. code-block:: python
+```python
+>>> print(factorial(4))
+24
- >>> print(factorial(4))
- 24
+```
This decorator is pretty impressive, and should give you some food for
thought! ;)
@@ -771,39 +786,41 @@ Here I will give a single example of usage. Suppose you want to log the moment
a coroutine starts and the moment it stops for debugging purposes. You could
write code like the following:
-.. code-block:: python
-
- import time
- import logging
- from asyncio import get_event_loop, sleep, wait
- from decorator import decorator
+```python
+import time
+import logging
+from asyncio import get_event_loop, sleep, wait
+from decorator import decorator
@decorator
- async def log_start_stop(coro, *args, **kwargs):
- logging.info('Starting %s%s', coro.__name__, args)
- t0 = time.time()
- await coro(*args, **kwargs)
- dt = time.time() - t0
- logging.info('Ending %s%s after %d seconds', coro.__name__, args, dt)
-
- @log_start_stop
- async def make_task(n):
- for i in range(n):
- await sleep(1)
-
- if __name__ == '__main__':
- logging.basicConfig(level=logging.INFO)
- tasks = [make_task(3), make_task(2), make_task(1)]
- get_event_loop().run_until_complete(wait(tasks))
-
-and you will get an output like this::
-
- INFO:root:Starting make_task(1,)
- INFO:root:Starting make_task(3,)
- INFO:root:Starting make_task(2,)
- INFO:root:Ending make_task(1,) after 1 seconds
- INFO:root:Ending make_task(2,) after 2 seconds
- INFO:root:Ending make_task(3,) after 3 seconds
+async def log_start_stop(coro, *args, **kwargs):
+ logging.info('Starting %s%s', coro.__name__, args)
+ t0 = time.time()
+ await coro(*args, **kwargs)
+ dt = time.time() - t0
+ logging.info('Ending %s%s after %d seconds', coro.__name__, args, dt)
+
+@log_start_stop
+async def make_task(n):
+ for i in range(n):
+ await sleep(1)
+
+if __name__ == '__main__':
+ logging.basicConfig(level=logging.INFO)
+ tasks = [make_task(3), make_task(2), make_task(1)]
+ get_event_loop().run_until_complete(wait(tasks))
+```
+
+and you will get an output like this:
+
+```bash
+INFO:root:Starting make_task(1,)
+INFO:root:Starting make_task(3,)
+INFO:root:Starting make_task(2,)
+INFO:root:Ending make_task(1,) after 1 seconds
+INFO:root:Ending make_task(2,) after 2 seconds
+INFO:root:Ending make_task(3,) after 3 seconds
+```
This may be handy if you have trouble understanding what is going on
with a particularly complex chain of coroutines. With a single line you
@@ -816,12 +833,12 @@ will return the right answer (i.e. ``True``).
It is also possible to define decorators converting coroutine functions
into regular functions, such as the following:
-.. code-block:: python
-
- @decorator
- def coro_to_func(coro, *args, **kw):
- "Convert a coroutine into a function"
+```python
+@decorator
+def coro_to_func(coro, *args, **kw):
+ "Convert a coroutine into a function"
return get_event_loop().run_until_complete(coro(*args, **kw))
+```
Notice the difference: the caller in ``log_start_stop`` was a coroutine
function and the associated decorator was converting coroutines->coroutines;
@@ -872,11 +889,12 @@ $$writefloat
Now ``XMLWriter`` can serialize floats:
-.. code-block:: python
+```python
+>>> writer = XMLWriter()
+>>> writer.write(2.3)
+'<float>2.3</float>'
- >>> writer = XMLWriter()
- >>> writer.write(2.3)
- '<float>2.3</float>'
+```
I could give a down-to-earth example of situations in which it is desirable
to dispatch on more than one argument--for instance, I once implemented
@@ -911,26 +929,27 @@ $$winRockScissors
Here is the result:
-.. code-block:: python
-
- >>> win(Paper(), Rock())
- 1
- >>> win(Scissors(), Paper())
- 1
- >>> win(Rock(), Scissors())
- 1
- >>> win(Paper(), Paper())
- 0
- >>> win(Rock(), Rock())
- 0
- >>> win(Scissors(), Scissors())
- 0
- >>> win(Rock(), Paper())
- -1
- >>> win(Paper(), Scissors())
- -1
- >>> win(Scissors(), Rock())
- -1
+```python
+>>> win(Paper(), Rock())
+1
+>>> win(Scissors(), Paper())
+1
+>>> win(Rock(), Scissors())
+1
+>>> win(Paper(), Paper())
+0
+>>> win(Rock(), Rock())
+0
+>>> win(Scissors(), Scissors())
+0
+>>> win(Rock(), Paper())
+-1
+>>> win(Paper(), Scissors())
+-1
+>>> win(Scissors(), Rock())
+-1
+
+```
The point of generic functions is that they play well with subclassing.
For instance, suppose we define a ``StrongRock``, which does not lose against
@@ -942,26 +961,27 @@ $$winStrongRockPaper
Then you do not need to define other implementations; they are
inherited from the parent:
-.. code-block:: python
+```python
+>>> win(StrongRock(), Scissors())
+1
- >>> win(StrongRock(), Scissors())
- 1
+```
You can introspect the precedence used by the dispatch algorithm by
calling ``.dispatch_info(*types)``:
-.. code-block:: python
+```python
+>>> win.dispatch_info(StrongRock, Scissors)
+[('StrongRock', 'Scissors'), ('Rock', 'Scissors')]
- >>> win.dispatch_info(StrongRock, Scissors)
- [('StrongRock', 'Scissors'), ('Rock', 'Scissors')]
+```
Since there is no direct implementation for (``StrongRock``, ``Scissors``),
the dispatcher will look at the implementation for (``Rock``, ``Scissors``)
which is available. Internally, the algorithm is doing a cross
product of the class precedence lists (or *Method Resolution Orders*,
-MRO_ for short) of ``StrongRock`` and ``Scissors``, respectively.
-
-.. _MRO: http://www.python.org/2.3/mro.html
+[MRO](http://www.python.org/2.3/mro.html) for short) of ``StrongRock``
+ and ``Scissors``, respectively.
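The precedence lists themselves can be inspected directly; a minimal sketch with stand-in classes:

```python
class Rock: pass
class Scissors: pass
class StrongRock(Rock): pass

def mro_names(cls):
    # the dispatcher works on the cross product of the two MROs
    return [c.__name__ for c in cls.__mro__]

assert mro_names(StrongRock) == ['StrongRock', 'Rock', 'object']
assert mro_names(Scissors) == ['Scissors', 'object']
```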
Generic functions and virtual ancestors
-------------------------------------------------
@@ -976,10 +996,11 @@ $$WithLength
This class defines a ``__len__`` method, and is therefore
considered to be a subclass of the abstract base class ``collections.Sized``:
-.. code-block:: python
+```python
+>>> issubclass(WithLength, collections.Sized)
+True
- >>> issubclass(WithLength, collections.Sized)
- True
+```
However, ``collections.Sized`` is not in the MRO of ``WithLength``; it
is not a true ancestor. Any implementation of generic functions (even
@@ -996,10 +1017,11 @@ $$get_length_sized
...then ``get_length`` must be defined on ``WithLength`` instances...
-.. code-block:: python
+```python
+>>> get_length(WithLength())
+0
- >>> get_length(WithLength())
- 0
+```
...even if ``collections.Sized`` is not a true ancestor of ``WithLength``.
@@ -1022,11 +1044,12 @@ Here, the author of ``SomeSet`` made a mistake by inheriting from
This is not a problem. You can register *a posteriori*
``collections.Set`` as a virtual ancestor of ``SomeSet``:
-.. code-block:: python
+```python
+>>> _ = collections.Set.register(SomeSet)
+>>> issubclass(SomeSet, collections.Set)
+True
- >>> _ = collections.Set.register(SomeSet)
- >>> issubclass(SomeSet, collections.Set)
- True
+```
Now, let's define an implementation of ``get_length`` specific to set:
@@ -1036,10 +1059,11 @@ The current implementation (and ``functools.singledispatch`` too)
is able to discern that a ``Set`` is a ``Sized`` object, by looking at
the class registry, so it uses the more specific implementation for ``Set``:
-.. code-block:: python
+```python
+>>> get_length(SomeSet()) # NB: the implementation for Sized would give 0
+1
- >>> get_length(SomeSet()) # NB: the implementation for Sized would give 0
- 1
+```
Sometimes it is not clear how to dispatch. For instance, consider a
class ``C`` registered both as ``collections.Iterable`` and
@@ -1089,11 +1113,12 @@ looking at the implementations. I will just notice that
``.dispatch_info`` is quite essential to see the class precedence
list used by the algorithm:
-.. code-block:: python
+```python
+>>> g, V = singledispatch_example2()
+>>> g.dispatch_info(V)
+[('V',), ('Sized',), ('S',), ('Container',)]
- >>> g, V = singledispatch_example2()
- >>> g.dispatch_info(V)
- [('V',), ('Sized',), ('S',), ('Container',)]
+```
The current implementation does not implement any kind of cooperation
between implementations. In other words, nothing is akin either to
@@ -1108,8 +1133,7 @@ Caveats and limitations
One thing you should be aware of is the performance penalty of decorators.
The worst case is shown by the following example:
-.. code-block:: bash
-
+```bash
$ cat performance.sh
python3 -m timeit -s "
from decorator import decorator
@@ -1128,13 +1152,15 @@ The worse case is shown by the following example:
pass
" "f()"
+```
On my laptop, using the ``do_nothing`` decorator instead of the
-plain function is five times slower::
+plain function is five times slower:
+```bash
$ bash performance.sh
1000000 loops, best of 3: 1.39 usec per loop
1000000 loops, best of 3: 0.278 usec per loop
-
+```
Of course, a real life function probably does something more useful
than the function ``f`` here, so the real life performance penalty
*could* be negligible. As always, the only way to know if there is a
@@ -1145,26 +1171,28 @@ tracebacks longer and more difficult to understand.
Consider this example:
-.. code-block:: python
+```python
+>>> @trace
+... def f():
+... 1/0
- >>> @trace
- ... def f():
- ... 1/0
+```
Calling ``f()`` gives you a ``ZeroDivisionError``.
But since the function is decorated, the traceback is longer:
-.. code-block:: python
+```python
+>>> f() # doctest: +ELLIPSIS
+Traceback (most recent call last):
+ ...
+ File "<string>", line 2, in f
+ File "<doctest __main__[22]>", line 4, in trace
+ return f(*args, **kw)
+ File "<doctest __main__[51]>", line 3, in f
+ 1/0
+ZeroDivisionError: ...
- >>> f() # doctest: +ELLIPSIS
- Traceback (most recent call last):
- ...
- File "<string>", line 2, in f
- File "<doctest __main__[22]>", line 4, in trace
- return f(*args, **kw)
- File "<doctest __main__[51]>", line 3, in f
- 1/0
- ZeroDivisionError: ...
+```
You see here the inner call to the decorator ``trace``, which calls
``f(*args, **kw)``, and a reference to ``File "<string>", line 2, in f``.
@@ -1193,8 +1221,6 @@ signature internally, as you can see by using
``inspect.getfullargspec`` - which has been rightly deprecated -
will see the wrong signature.
-.. _362: http://www.python.org/dev/peps/pep-0362
-
In the present implementation, decorators generated by ``decorator``
can only be used on user-defined Python functions or methods.
They cannot be used on generic callable objects or built-in functions,
@@ -1203,9 +1229,8 @@ for Python 2. In Python 3.5, many such limitations have been removed, but
I still think that it is cleaner and safer to decorate only
functions. If you want to decorate things like classmethods/staticmethods
and general callables - which I will never support in the decorator module -
-I suggest you to look at the wrapt_ project by Graeme Dumpleton.
-
-.. _wrapt: https://wrapt.readthedocs.io/en/latest/
+I suggest you look at the [wrapt](https://wrapt.readthedocs.io/en/latest/)
+project by Graham Dumpleton.
There is a strange quirk when decorating functions with keyword
arguments, if one of the arguments has the same name used in the
@@ -1214,15 +1239,17 @@ David Goldstein.
Here is an example where it is manifest:
-.. code-block:: python
+```python
+>>> @memoize
+... def getkeys(**kw):
+... return kw.keys()
- >>> @memoize
- ... def getkeys(**kw):
- ... return kw.keys()
- >>> getkeys(func='a') # doctest: +ELLIPSIS
- Traceback (most recent call last):
- ...
- TypeError: _memoize() got multiple values for ... 'func'
+>>> getkeys(func='a') # doctest: +ELLIPSIS
+Traceback (most recent call last):
+ ...
+TypeError: _memoize() got multiple values for ... 'func'
+
+```
The error message looks really strange... until you realize that
the caller function `_memoize` uses `func` as first argument,
keyword arguments.
The solution is to change the name of the first argument in `_memoize`,
or to change the implementation like so:
-.. code-block:: python
+```python
- def _memoize(*all_args, **kw):
- func = all_args[0]
- args = all_args[1:]
- if kw: # frozenset is used to ensure hashability
- key = args, frozenset(kw.items())
- else:
- key = args
- cache = func.cache # attribute added by memoize
- if key not in cache:
- cache[key] = func(*args, **kw)
- return cache[key]
+def _memoize(*all_args, **kw):
+ func = all_args[0]
+ args = all_args[1:]
+ if kw: # frozenset is used to ensure hashability
+ key = args, frozenset(kw.items())
+ else:
+ key = args
+ cache = func.cache # attribute added by memoize
+ if key not in cache:
+ cache[key] = func(*args, **kw)
+ return cache[key]
+```
This avoids the need to name the first argument, so the problem
simply disappears. This is a technique that you should keep in mind
@@ -1256,37 +1284,35 @@ that.
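The clash itself, reduced to plain functions (a sketch independent of the decorator module):

```python
def caller_named(func, **kw):
    # first argument is named 'func': a 'func' keyword collides with it
    return sorted(kw)

def caller_positional(*all_args, **kw):
    # taking the function positionally leaves every keyword name free
    func, args = all_args[0], all_args[1:]
    return sorted(kw)

try:
    caller_named(len, func='a')
except TypeError:
    pass  # got multiple values for argument 'func'
else:
    raise AssertionError("expected a TypeError")

assert caller_positional(len, func='a') == ['func']
```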
On a similar note, there is a restriction on argument names. For instance,
if you name an argument ``_call_`` or ``_func_``, you will get a ``NameError``:
-.. code-block:: python
+```python
+>>> @trace
+... def f(_func_): print(f)
+...
+Traceback (most recent call last):
+ ...
+NameError: _func_ is overridden in
+def f(_func_):
+ return _call_(_func_, _func_)
- >>> @trace
- ... def f(_func_): print(f)
- ...
- Traceback (most recent call last):
- ...
- NameError: _func_ is overridden in
- def f(_func_):
- return _call_(_func_, _func_)
+```
Finally, the implementation is such that the decorated function makes
a (shallow) copy of the original function dictionary:
-.. code-block:: python
-
- >>> def f(): pass # the original function
- >>> f.attr1 = "something" # setting an attribute
- >>> f.attr2 = "something else" # setting another attribute
+```python
+>>> def f(): pass # the original function
+>>> f.attr1 = "something" # setting an attribute
+>>> f.attr2 = "something else" # setting another attribute
- >>> traced_f = trace(f) # the decorated function
+>>> traced_f = trace(f) # the decorated function
- >>> traced_f.attr1
- 'something'
- >>> traced_f.attr2 = "something different" # setting attr
- >>> f.attr2 # the original attribute did not change
- 'something else'
+>>> traced_f.attr1
+'something'
+>>> traced_f.attr2 = "something different" # setting attr
+>>> f.attr2 # the original attribute did not change
+'something else'
-.. _function annotations: http://www.python.org/dev/peps/pep-3107/
-.. _docutils: http://docutils.sourceforge.net/
-.. _pygments: http://pygments.org/
+```
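"Shallow" matters here: rebinding an attribute on one function does not affect the other (as shown above), but a *mutable* attribute value stays shared. A stdlib sketch of the distinction, using plain dictionaries:

```python
import copy

attrs = {"tags": ["a"]}
attrs_copy = copy.copy(attrs)   # shallow: the inner list object is shared

attrs_copy["tags"].append("b")  # mutating a shared value is visible in both
assert attrs["tags"] == ["a", "b"]

attrs_copy["owner"] = "new"     # rebinding a key affects only the copy
assert "owner" not in attrs
```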
LICENSE (2-clause BSD)
---------------------------------------------
@@ -1333,39 +1359,42 @@ stored in a dictionary named ``__annotations__``. The ``decorator`` module
Here is an example:
-.. code-block:: python
+```python
+>>> @trace
+... def f(x: 'the first argument', y: 'default argument'=1, z=2,
+... *args: 'varargs', **kw: 'kwargs'):
+... pass
- >>> @trace
- ... def f(x: 'the first argument', y: 'default argument'=1, z=2,
- ... *args: 'varargs', **kw: 'kwargs'):
- ... pass
+```
In order to introspect functions with annotations, one needs the
utility ``inspect.getfullargspec`` (introduced in Python 3, then
deprecated in Python 3.5, then undeprecated in Python 3.6):
-.. code-block:: python
-
- >>> from inspect import getfullargspec
- >>> argspec = getfullargspec(f)
- >>> argspec.args
- ['x', 'y', 'z']
- >>> argspec.varargs
- 'args'
- >>> argspec.varkw
- 'kw'
- >>> argspec.defaults
- (1, 2)
- >>> argspec.kwonlyargs
- []
- >>> argspec.kwonlydefaults
+```python
+>>> from inspect import getfullargspec
+>>> argspec = getfullargspec(f)
+>>> argspec.args
+['x', 'y', 'z']
+>>> argspec.varargs
+'args'
+>>> argspec.varkw
+'kw'
+>>> argspec.defaults
+(1, 2)
+>>> argspec.kwonlyargs
+[]
+>>> argspec.kwonlydefaults
+
+```
You can check that the ``__annotations__`` dictionary is preserved:
-.. code-block:: python
+```python
+>>> f.__annotations__ is f.__wrapped__.__annotations__
+True
- >>> f.__annotations__ is f.__wrapped__.__annotations__
- True
+```
Here ``f.__wrapped__`` is the original undecorated function.
This attribute exists for consistency with the behavior of