PEP: 3124
Title: Overloading, Generic Functions, Interfaces, and Adaptation
Version: $Revision$
Last-Modified: $Date$
Author: Phillip J. Eby <pje@telecommunity.com>
Discussions-To: Python 3000 List <python-3000@python.org>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Requires: 3107, 3115, 3119
Created: 28-Apr-2007
Post-History: 30-Apr-2007
Replaces: 245, 246


Deferred
========

See http://mail.python.org/pipermail/python-3000/2007-July/008784.html.


Abstract
========

This PEP proposes a new standard library module, ``overloading``, to
provide generic programming features including dynamic overloading
(aka generic functions), interfaces, adaptation, method combining (à la
CLOS and AspectJ), and simple forms of aspect-oriented programming
(AOP).

The proposed API is also open to extension; that is, it will be
possible for library developers to implement their own specialized
interface types, generic function dispatchers, method combination
algorithms, etc., and those extensions will be treated as first-class
citizens by the proposed API.

The API will be implemented in pure Python with no C, but may have
some dependency on CPython-specific features such as ``sys._getframe``
and the ``func_code`` attribute of functions.  It is expected that
e.g. Jython and IronPython will have other ways of implementing
similar functionality (perhaps using Java or C#).


Rationale and Goals
===================

Python has always provided a variety of built-in and standard-library
generic functions, such as ``len()``, ``iter()``, ``pprint.pprint()``,
and most of the functions in the ``operator`` module.  However, it
currently:

1. does not have a simple or straightforward way for developers to
   create new generic functions,

2. does not have a standard way for methods to be added to existing
   generic functions (i.e., some are added using registration
   functions, others require defining ``__special__`` methods,
   possibly by monkeypatching), and

3. does not allow dispatching on multiple argument types (except in
   a limited form for arithmetic operators, where "right-hand"
   (``__r*__``) methods can be used to do two-argument dispatch).

In addition, it is currently a common anti-pattern for Python code
to inspect the types of received arguments, in order to decide what
to do with the objects.  For example, code may wish to accept either
an object of some type, or a sequence of objects of that type.

Currently, the "obvious way" to do this is by type inspection, but
this is brittle and closed to extension.  A developer using an
already-written library may be unable to change how their objects are
treated by such code, especially if the objects they are using were
created by a third party.

Therefore, this PEP proposes a standard library module to address
these and related issues, using decorators and argument annotations
(PEP 3107).  The primary features to be provided are:

* a dynamic overloading facility, similar to the static overloading
  found in languages such as Java and C++, but including optional
  method combination features as found in CLOS and AspectJ.

* a simple "interfaces and adaptation" library inspired by Haskell's
  typeclasses (but more dynamic, and without any static type-checking),
  with an extension API to allow registering user-defined interface
  types such as those found in PyProtocols and Zope.

* a simple "aspect" implementation to make it easy to create stateful
  adapters and to do other stateful AOP.

These features are to be provided in such a way that extended
implementations can be created and used.  For example, it should be
possible for libraries to define new dispatching criteria for
generic functions, and new kinds of interfaces, and use them in
place of the predefined features.  For example, it should be possible
to use a ``zope.interface`` interface object to specify the desired
type of a function argument, as long as the ``zope.interface`` package
registered itself correctly (or a third party did the registration).

In this way, the proposed API simply offers a uniform way of accessing
the functionality within its scope, rather than prescribing a single
implementation to be used for all libraries, frameworks, and
applications.


User API
========

The overloading API will be implemented as a single module, named
``overloading``, providing the following features:


Overloading/Generic Functions
-----------------------------

The ``@overload`` decorator allows you to define alternate
implementations of a function, specialized by argument type(s).  A
function with the same name must already exist in the local namespace.
The existing function is modified in-place by the decorator to add
the new implementation, and the modified function is returned by the
decorator.  Thus, the following code::

    from overloading import overload
    from collections import Iterable
    
    def flatten(ob):
        """Flatten an object to its component iterables"""
        yield ob

    @overload
    def flatten(ob: Iterable):
        for o in ob:
            for ob in flatten(o):
                yield ob

    @overload
    def flatten(ob: basestring):
        yield ob

creates a single ``flatten()`` function whose implementation roughly
equates to::

    def flatten(ob):
        if isinstance(ob, basestring) or not isinstance(ob, Iterable):
            yield ob
        else:
            for o in ob:
                for ob in flatten(o):
                    yield ob

**except** that the ``flatten()`` function defined by overloading
remains open to extension by adding more overloads, while the
hardcoded version cannot be extended.
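
The proposed ``overloading`` module was never added to the standard
library, but the single-dispatch subset of this example can be
reproduced today with ``functools.singledispatch`` (added much later by
PEP 443).  A rough sketch, substituting ``str`` for ``basestring``:

```python
from functools import singledispatch
from collections.abc import Iterable

@singledispatch
def flatten(ob):
    """Default case: a non-iterable leaf."""
    yield ob

@flatten.register
def _(ob: Iterable):
    for o in ob:
        yield from flatten(o)

@flatten.register
def _(ob: str):
    # str is registered more specifically than Iterable, so it wins
    yield ob

print(list(flatten([1, [2, 3], "ab"])))  # [1, 2, 3, 'ab']
```

Like the proposed ``@overload``, the ``register`` decorator keeps the
function open to extension; unlike it, dispatch considers only the
first argument.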

For example, if someone wants to use ``flatten()`` with a string-like
type that doesn't subclass ``basestring``, they would be out of luck
with the second implementation.  With the overloaded implementation,
however, they can either write this::

    @overload
    def flatten(ob: MyString):
        yield ob

or this (to avoid copying the implementation)::

    from overloading import RuleSet
    RuleSet(flatten).copy_rules((basestring,), (MyString,))

(Note also that, although PEP 3119 proposes that it should be possible
for abstract base classes like ``Iterable`` to allow classes like
``MyString`` to claim subclass-hood, such a claim is *global*,
throughout the application.  In contrast, adding a specific overload
or copying a rule is specific to an individual function, and therefore
less likely to have undesired side effects.)


``@overload`` vs. ``@when``
~~~~~~~~~~~~~~~~~~~~~~~~~~~

The ``@overload`` decorator is a common-case shorthand for the more
general ``@when`` decorator.  It allows you to leave out the name of
the function you are overloading, at the expense of requiring the
target function to be in the local namespace.  It also doesn't support
adding additional criteria besides the ones specified via argument
annotations.  The following function definitions have identical
effects, except for name binding side-effects (which will be described
below)::

    from overloading import when

    @overload
    def flatten(ob: basestring):
        yield ob

    @when(flatten)
    def flatten(ob: basestring):
        yield ob

    @when(flatten)
    def flatten_basestring(ob: basestring):
        yield ob

    @when(flatten, (basestring,))
    def flatten_basestring(ob):
        yield ob

The first definition above will bind ``flatten`` to whatever it was
previously bound to.  The second will do the same, if it was already
bound to the ``when`` decorator's first argument.  If ``flatten`` is
unbound or bound to something else, it will be rebound to the function
definition as given.  The last two definitions above will always bind
``flatten_basestring`` to the function definition as given.

Using this approach allows you to both give a method a descriptive
name (often useful in tracebacks!) and to reuse the method later.

Except as otherwise specified, all ``overloading`` decorators have the
same signature and binding rules as ``@when``.  They accept a function
and an optional "predicate" object.

The default predicate implementation is a tuple of types with
positional matching to the overloaded function's arguments.  However,
an arbitrary number of other kinds of predicates can be created and
registered using the `Extension API`_, and will then be usable with
``@when`` and other decorators created by this module (like
``@before``, ``@after``, and ``@around``).

 
Method Combination and Overriding
---------------------------------

When an overloaded function is invoked, the implementation with the
signature that *most specifically matches* the calling arguments is
the one used.  If no implementation matches, a ``NoApplicableMethods``
error is raised.  If more than one implementation matches, but none of
the signatures are more specific than the others, an ``AmbiguousMethods``
error is raised.

For example, the following pair of implementations are ambiguous, if
the ``foo()`` function is ever called with two integer arguments,
because both signatures would apply, but neither signature is more
*specific* than the other (i.e., neither implies the other)::

    def foo(bar:int, baz:object):
        pass

    @overload
    def foo(bar:object, baz:int):
        pass

In contrast, the following pair of implementations can never be
ambiguous, because one signature always implies the other; the
``int/int`` signature is more specific than the ``object/object``
signature::

    def foo(bar:object, baz:object):
        pass

    @overload
    def foo(bar:int, baz:int):
        pass

A signature S1 implies another signature S2, if whenever S1 would
apply, S2 would also.  A signature S1 is "more specific" than another
signature S2, if S1 implies S2, but S2 does not imply S1.
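
For the default tuple-of-types predicate, this implication test reduces
to positionwise subclass checks.  A minimal sketch (the function names
are illustrative, not part of the proposed API):

```python
def implies(s1, s2):
    """True if whenever signature s1 applies, s2 applies too: every
    argument type in s1 is a subclass of the corresponding type in s2."""
    return len(s1) == len(s2) and all(
        issubclass(a, b) for a, b in zip(s1, s2))

def more_specific(s1, s2):
    return implies(s1, s2) and not implies(s2, s1)

# int/int is more specific than object/object...
assert more_specific((int, int), (object, object))
# ...but int/object and object/int are mutually ambiguous
assert not more_specific((int, object), (object, int))
assert not more_specific((object, int), (int, object))
```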

Although the examples above have all used concrete or abstract types
as argument annotations, there is no requirement that the annotations
be such.  They can also be "interface" objects (discussed in the
`Interfaces and Adaptation`_ section), including user-defined
interface types.  (They can also be other objects whose types are
appropriately registered via  the `Extension API`_.)


Proceeding to the "Next" Method
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

If the first parameter of an overloaded function is named
``__proceed__``, it will be passed a callable representing the next
most-specific method.  For example, this code::

    def foo(bar:object, baz:object):
        print("got objects!")

    @overload
    def foo(__proceed__, bar:int, baz:int):
        print("got integers!")
        return __proceed__(bar, baz)

Will print "got integers!" followed by "got objects!".

If there is no next most-specific method, ``__proceed__`` will be
bound to a ``NoApplicableMethods`` instance.  When called, a new
``NoApplicableMethods`` instance will be raised, with the arguments
passed to the first instance.

Similarly, if the next most-specific methods have ambiguous precedence
with respect to each other, ``__proceed__`` will be bound to an
``AmbiguousMethods`` instance, and if called, it will raise a new
instance.

Thus, a method can either check if ``__proceed__`` is an error
instance, or simply invoke it.  The ``NoApplicableMethods`` and
``AmbiguousMethods`` error classes have a common ``DispatchError``
base class, so ``isinstance(__proceed__, overloading.DispatchError)``
is sufficient to identify whether ``__proceed__`` can be safely
called.

(Implementation note: using a magic argument name like ``__proceed__``
could potentially be replaced by a magic function that would be called
to obtain the next method.  A magic function, however, would degrade
performance and might be more difficult to implement on non-CPython
platforms.  Method chaining via magic argument names, however, can be
efficiently implemented on any Python platform that supports creating
bound methods from functions -- one simply recursively binds each
function to be chained, using the following function or error as the
``im_self`` of the bound method.)


"Before" and "After" Methods
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

In addition to the simple next-method chaining shown above, it is
sometimes useful to have other ways of combining methods.  For
example, the "observer pattern" can sometimes be implemented by adding
extra methods to a function, that execute before or after the normal
implementation.

To support these use cases, the ``overloading`` module will supply
``@before``, ``@after``, and ``@around`` decorators, that roughly
correspond to the same types of methods in the Common Lisp Object
System (CLOS), or the corresponding "advice" types in AspectJ.

Like ``@when``, all of these decorators must be passed the function to
be overloaded, and can optionally accept a predicate as well::

    from overloading import before, after

    def begin_transaction(db):
        print("Beginning the actual transaction")

    @before(begin_transaction)
    def check_single_access(db: SingletonDB):
        if db.inuse:
            raise TransactionError("Database already in use")

    @after(begin_transaction)
    def start_logging(db: LoggableDB):
        db.set_log_level(VERBOSE)


``@before`` and ``@after`` methods are invoked either before or after
the main function body, and are *never considered ambiguous*.  That
is, it will not cause any errors to have multiple "before" or "after"
methods with identical or overlapping signatures.  Ambiguities are
resolved using the order in which the methods were added to the
target function.  

"Before" methods are invoked most-specific method first, with
ambiguous methods being executed in the order they were added.  All
"before" methods are called before any of the function's "primary"
methods (i.e. normal ``@overload`` methods) are executed.

"After" methods are invoked in the *reverse* order, after all of the
function's "primary" methods are executed.  That is, they are executed
least-specific method first, with ambiguous methods being executed in
the reverse of the order in which they were added.

The return values of both "before" and "after" methods are ignored,
and any uncaught exceptions raised by *any* methods (primary or other)
immediately end the dispatching process.  "Before" and "after" methods
cannot have ``__proceed__`` arguments, as they are not responsible
for calling any other methods.  They are simply called as a
notification before or after the primary methods.

Thus, "before" and "after" methods can be used to check or establish
preconditions (e.g. by raising an error if the conditions aren't met)
or to ensure postconditions, without needing to duplicate any existing
functionality.
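
The ordering rules above can be pictured as a plain driver loop; a
sketch, assuming each method list is already sorted most-specific
first, and chaining only a single primary method for brevity:

```python
def invoke(befores, primaries, afters, *args):
    # "before" methods: most-specific first, return values ignored
    for m in befores:
        m(*args)
    # primary methods: only the most-specific one is called here
    # (a real implementation chains them via __proceed__)
    result = primaries[0](*args)
    # "after" methods: least-specific first, i.e. reverse order
    for m in reversed(afters):
        m(*args)
    return result

log = []
invoke(
    [lambda x: log.append("before")],
    [lambda x: log.append("primary") or "result"],
    [lambda x: log.append("after")],
    42,
)
print(log)  # ['before', 'primary', 'after']
```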


"Around" Methods
~~~~~~~~~~~~~~~~

The ``@around`` decorator declares a method as an "around" method.
"Around" methods are much like primary methods, except that the
least-specific "around" method has higher precedence than the
most-specific "before" method.

Unlike "before" and "after" methods, however, "Around" methods *are*
responsible for calling their ``__proceed__`` argument, in order to
continue the invocation process.  "Around" methods are usually used
to transform input arguments or return values, or to wrap specific
cases with special error handling or try/finally conditions, e.g.::

    from overloading import around

    @around(commit_transaction)
    def lock_while_committing(__proceed__, db: SingletonDB):
        with db.global_lock:
            return __proceed__(db)

They can also be used to replace the normal handling for a specific
case, by *not* invoking the ``__proceed__`` function.

The ``__proceed__`` given to an "around" method will either be the
next applicable "around" method, a ``DispatchError`` instance,
or a synthetic method object that will call all the "before" methods,
followed by the primary method chain, followed by all the "after"
methods, and return the result from the primary method chain.

Thus, just as with normal methods, ``__proceed__`` can be checked for
``DispatchError``-ness, or simply invoked.  The "around" method should
return the value returned by ``__proceed__``, unless of course it
wishes to modify or replace it with a different return value for the
function as a whole.
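
An "around" method can thus be pictured as an extra layer bound
outside the entire before/primary/after core.  A sketch of the
``lock_while_committing`` example, using list appends as a stand-in
for the (hypothetical) ``db.global_lock``:

```python
from types import MethodType

def add_arounds(arounds, core):
    """Wrap 'around' methods outside the core, outermost first."""
    nxt = core
    for func in reversed(arounds):
        nxt = MethodType(func, nxt)
    return nxt

def lock_while_committing(__proceed__, db):
    db.append("lock")          # stand-in for acquiring db.global_lock
    try:
        return __proceed__(db)
    finally:
        db.append("unlock")    # stand-in for releasing it

entry = add_arounds([lock_while_committing],
                    lambda db: db.append("commit") or "done")
db = []
print(entry(db), db)  # done ['lock', 'commit', 'unlock']
```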


Custom Combinations
~~~~~~~~~~~~~~~~~~~

The decorators described above (``@overload``, ``@when``, ``@before``,
``@after``, and ``@around``) collectively implement what in CLOS is
called the "standard method combination" -- the most common patterns
used in combining methods.

Sometimes, however, an application or library may have use for a more
sophisticated type of method combination.  For example, if you
would like to have "discount" methods that return a percentage off,
to be subtracted from the value returned by the primary method(s),
you might write something like this::

    from overloading import always_overrides, merge_by_default
    from overloading import Around, Before, After, Method, MethodList

    class Discount(MethodList):
        """Apply return values as discounts"""
    
        def __call__(self, *args, **kw):
            retval = self.tail(*args, **kw)
            for sig, body in self.sorted():
                retval -= retval * body(*args, **kw)
            return retval

    # merge discounts by priority
    merge_by_default(Discount)

    # discounts have precedence over before/after/primary methods
    always_overrides(Discount, Before)
    always_overrides(Discount, After)
    always_overrides(Discount, Method)

    # but not over "around" methods
    always_overrides(Around, Discount)

    # Make a decorator called "discount" that works just like the 
    # standard decorators...
    discount = Discount.make_decorator('discount')

    # and now let's use it...
    def price(product):
        return product.list_price

    @discount(price)
    def ten_percent_off_shoes(product: Shoe):
        return Decimal('0.1')
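
Note that the arithmetic in ``Discount.__call__`` compounds: each
discount is subtracted from the running total, not from the list
price.  That behavior can be checked in isolation:

```python
from decimal import Decimal

def apply_discounts(list_price, fractions):
    # each fraction comes off the running total, as in Discount.__call__
    retval = list_price
    for frac in fractions:
        retval -= retval * frac
    return retval

print(apply_discounts(Decimal("100"), [Decimal("0.1")]))
# 90.0
print(apply_discounts(Decimal("100"), [Decimal("0.1"), Decimal("0.5")]))
# 45.00  (half off the already-discounted 90, not half off 100)
```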

Similar techniques can be used to implement a wide variety of
CLOS-style method qualifiers and combination rules.  The process of
creating custom method combination objects and their corresponding
decorators is described in more detail under the `Extension API`_
section.

Note, by the way, that the ``@discount`` decorator shown will work
correctly with any new predicates defined by other code.  For example,
if ``zope.interface`` were to register its interface types to work
correctly as argument annotations, you would be able to specify
discounts on the basis of its interface types, not just classes or
``overloading``-defined interface types.

Similarly, if a library like RuleDispatch or PEAK-Rules were to
register an appropriate predicate implementation and dispatch engine,
one would then be able to use those predicates for discounts as well,
e.g.::

    from somewhere import Pred  # some predicate implementation
    
    @discount(
        price,
        Pred("isinstance(product,Shoe) and"
             " product.material.name=='Blue Suede'")
    )
    def forty_off_blue_suede_shoes(product):
        return Decimal('0.4')

The process of defining custom predicate types and dispatching engines
is also described in more detail under the `Extension API`_ section.


Overloading Inside Classes
--------------------------

All of the decorators above have a special additional behavior when
they are directly invoked within a class body: the first parameter
(other than ``__proceed__``, if present) of the decorated function
will be treated as though it had an annotation equal to the class
in which it was defined.

That is, this code::

    class And(object):
        # ...
        @when(get_conjuncts)
        def __conjuncts(self):
            return self.conjuncts

produces the same effect as this (apart from the existence of a
private method)::

    class And(object):
        # ...

    @when(get_conjuncts)
    def get_conjuncts_of_and(ob: And):
        return ob.conjuncts    

This behavior is both a convenience enhancement when defining lots of
methods, and a requirement for safely distinguishing multi-argument
overloads in subclasses.  Consider, for example, the following code::
        
    class A(object):
        def foo(self, ob):
            print("got an object")

        @overload
        def foo(__proceed__, self, ob:Iterable):
            print("it's iterable!")
            return __proceed__(self, ob)


    class B(A):
        foo = A.foo     # foo must be defined in local namespace

        @overload
        def foo(__proceed__, self, ob:Iterable):
            print("B got an iterable!")
            return __proceed__(self, ob)

Due to the implicit class rule, calling ``B().foo([])`` will print
"B got an iterable!" followed by "it's iterable!", and finally,
"got an object", while ``A().foo([])`` would print only the messages
defined in ``A``.

Conversely, without the implicit class rule, the two "Iterable"
methods would have the exact same applicability conditions, so calling
either ``A().foo([])`` or ``B().foo([])`` would result in an
``AmbiguousMethods`` error.

It is currently an open issue to determine the best way to implement
this rule in Python 3.0.  Under Python 2.x, a class' metaclass was
not chosen until the end of the class body, which means that
decorators could insert a custom metaclass to do processing of this
sort.  (This is how RuleDispatch, for example, implements the implicit
class rule.)

PEP 3115, however, requires that a class' metaclass be determined
*before* the class body has executed, making it impossible to use this
technique for class decoration any more.

At this writing, discussion on this issue is ongoing.


Interfaces and Adaptation
-------------------------

The ``overloading`` module provides a simple implementation of
interfaces and adaptation.  The following example defines an
``IStack`` interface, and declares that ``list`` objects support it::

    from overloading import abstract, Interface

    class IStack(Interface):
        @abstract
        def push(self, ob):
            """Push 'ob' onto the stack"""

        @abstract
        def pop(self):
            """Pop a value and return it"""


    when(IStack.push, (list, object))(list.append)
    when(IStack.pop, (list,))(list.pop)

    mylist = []
    mystack = IStack(mylist)
    mystack.push(42)
    assert mystack.pop()==42

The ``Interface`` class is a kind of "universal adapter".  It accepts
a single argument: an object to adapt.  It then binds all its methods
to the target object, in place of itself.  Thus, calling
``mystack.push(42)`` is the same as calling
``IStack.push(mylist, 42)``.

The ``@abstract`` decorator marks a function as being abstract: i.e.,
having no implementation.  If an ``@abstract`` function is called,
it raises ``NoApplicableMethods``.  To become executable, overloaded
methods must be added using the techniques previously described. (That
is, methods can be added using ``@when``, ``@before``, ``@after``,
``@around``, or any custom method combination decorators.)

In the example above, the ``list.append`` method is added as a method
for ``IStack.push()`` when its arguments are a list and an arbitrary
object.  Thus, ``IStack.push(mylist, 42)`` is translated to
``list.append(mylist, 42)``, thereby implementing the desired
operation.
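
The adapter mechanics can be emulated without the proposed
``overloading`` machinery.  In this sketch, each interface "method"
consults a per-type registry (the ``_push``/``_pop`` dicts are a
hypothetical stand-in for the real dispatch tables), and the adapter
instance supplies the adaptee as the first argument:

```python
class IStack:
    """Sketch of the 'universal adapter' pattern: registered
    implementations keyed by adaptee type."""
    _push = {list: list.append}   # as if: when(IStack.push, (list, object))
    _pop = {list: list.pop}       # as if: when(IStack.pop, (list,))

    def __init__(self, ob):
        self._ob = ob

    def push(self, ob):
        return self._push[type(self._ob)](self._ob, ob)

    def pop(self):
        return self._pop[type(self._ob)](self._ob)

mylist = []
mystack = IStack(mylist)
mystack.push(42)
assert mystack.pop() == 42
```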


Abstract and Concrete Methods
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Note, by the way, that the ``@abstract`` decorator is not limited to
use in interface definitions; it can be used anywhere that you wish to
create an "empty" generic function that initially has no methods.  In
particular, it need not be used inside a class.

Also note that interface methods need not be abstract; one could, for
example, write an interface like this::

    class IWriteMapping(Interface):
        @abstract
        def __setitem__(self, key, value):
            """This has to be implemented"""

        def update(self, other:IReadMapping):
            for k, v in IReadMapping(other).items():
                self[k] = v

As long as ``__setitem__`` is defined for some type, the above
interface will provide a usable ``update()`` implementation.  However,
if some specific type (or pair of types) has a more efficient way of
handling ``update()`` operations, an appropriate overload can still
be registered for use in that case.


Subclassing and Re-assembly
~~~~~~~~~~~~~~~~~~~~~~~~~~~

Interfaces can be subclassed::

    class ISizedStack(IStack):
        @abstract
        def __len__(self):
            """Return the number of items on the stack"""

    # define __len__ support for ISizedStack
    when(ISizedStack.__len__, (list,))(list.__len__)

Or assembled by combining functions from existing interfaces::

    class Sizable(Interface):
        __len__ = ISizedStack.__len__

    # list now implements Sizable as well as ISizedStack, without
    # making any new declarations!

A class can be considered to "adapt to" an interface at a given
point in time, if no method defined in the interface is guaranteed to
raise a ``NoApplicableMethods`` error if invoked on an instance of
that class at that point in time.

In normal usage, however, it is "easier to ask forgiveness than
permission".  That is, it is easier to simply use an interface on
an object by adapting it to the interface (e.g. ``IStack(mylist)``)
or invoking interface methods directly (e.g. ``IStack.push(mylist,
42)``), than to try to figure out whether the object is adaptable to
(or directly implements) the interface.


Implementing an Interface in a Class
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

It is possible to declare that a class directly implements an
interface, using the ``declare_implementation()`` function::

    from overloading import declare_implementation

    class Stack(object):
        def __init__(self):
            self.data = []
        def push(self, ob):
            self.data.append(ob)
        def pop(self):
            return self.data.pop()

    declare_implementation(IStack, Stack)

The ``declare_implementation()`` call above is roughly equivalent to
the following steps::

    when(IStack.push, (Stack,object))(lambda self, ob: self.push(ob))
    when(IStack.pop, (Stack,))(lambda self: self.pop())

That is, calling ``IStack.push()`` or ``IStack.pop()`` on an instance
of any subclass of ``Stack``, will simply delegate to the actual
``push()`` or ``pop()`` methods thereof.

For the sake of efficiency, calling ``IStack(s)`` where ``s`` is an
instance of ``Stack``, **may** return ``s`` rather than an ``IStack``
adapter.  (Note that calling ``IStack(x)`` where ``x`` is already an
``IStack`` adapter will always return ``x`` unchanged; this is an
additional optimization allowed in cases where the adaptee is known
to *directly* implement the interface, without adaptation.)

For convenience, it may be useful to declare implementations in the
class header, e.g.::

    class Stack(metaclass=Implementer, implements=IStack):
        ...

rather than calling ``declare_implementation()`` after the end of the
suite.
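A metaclass supporting this keyword could look roughly like the
following sketch.  Since the ``overloading`` module is hypothetical,
``declare_implementation()`` is stubbed out here to simply record its
arguments:

```python
registrations = []

def declare_implementation(iface, cls):
    # Stub standing in for overloading.declare_implementation()
    registrations.append((iface, cls))

class Implementer(type):
    # Consume the "implements" class keyword (per PEP 3115) and
    # register the new class as a direct implementation.
    def __new__(mcls, name, bases, namespace, implements=None):
        cls = super().__new__(mcls, name, bases, namespace)
        if implements is not None:
            declare_implementation(implements, cls)
        return cls

    def __init__(cls, name, bases, namespace, implements=None):
        super().__init__(name, bases, namespace)

class IStack:       # placeholder interface for the sketch
    pass

class Stack(metaclass=Implementer, implements=IStack):
    def __init__(self):
        self.data = []
    def push(self, ob):
        self.data.append(ob)
    def pop(self):
        return self.data.pop()
```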


Interfaces as Type Specifiers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

``Interface`` subclasses can be used as argument annotations to
indicate what type of objects are acceptable to an overload, e.g.::

    @overload
    def traverse(g: IGraph, s: IStack):
        g = IGraph(g)
        s = IStack(s)
        # etc....

Note, however, that the actual arguments are *not* changed or adapted
in any way by the mere use of an interface as a type specifier.  You
must explicitly cast the objects to the appropriate interface, as
shown above.

Other patterns of interface use are possible, however.
For example, other interface implementations might not support
adaptation, or might require that function arguments already be
adapted to the specified interface.  So the exact semantics of using
an interface as a type specifier are dependent on the interface
objects you actually use.

For the interface objects defined by this PEP, however, the semantics
are as described above.  An interface I1 is considered "more specific"
than another interface I2 if the set of descriptors in I1's
inheritance hierarchy is a proper superset of the descriptors in I2's
inheritance hierarchy.

So, for example, ``ISizedStack`` is more specific than both
``ISizable`` and ``IStack``, irrespective of the inheritance
relationships between these interfaces.  It is purely a question of
what operations are included within those interfaces -- and the
*names* of the operations are unimportant.
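One plausible reading of this rule can be sketched in plain Python
(the helper names are hypothetical, not part of the ``overloading``
module): collect the descriptor objects found anywhere in each
interface's inheritance hierarchy, then compare the sets.  The sample
interfaces below happen to share descriptors through inheritance, but
the rule itself looks only at the resulting sets:

```python
def descriptors(iface):
    # Collect the descriptor objects (functions, properties, ...)
    # found anywhere in the interface's inheritance hierarchy.
    found = set()
    for cls in iface.__mro__:
        if cls is not object:
            found.update(v for k, v in vars(cls).items()
                         if not k.startswith('__'))
    return found

def more_specific(i1, i2):
    # i1 is more specific than i2 if i1's descriptor set is a
    # proper superset of i2's.
    return descriptors(i1) > descriptors(i2)

class IStack:
    def push(self, ob): ...
    def pop(self): ...

class ISizable:
    def size(self): ...

class ISizedStack(IStack, ISizable):
    pass
```

Here ``more_specific(ISizedStack, IStack)`` holds because
``ISizedStack``'s descriptor set strictly contains ``IStack``'s, not
because of any inheritance relationship per se.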

Interfaces (at least the ones provided by ``overloading``) are always
considered less-specific than concrete classes.  Other interface
implementations can decide on their own specificity rules, both
between interfaces and other interfaces, and between interfaces and
classes.


Non-Method Attributes in Interfaces
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The ``Interface`` implementation actually treats all attributes and
methods (i.e. descriptors) in the same way: their ``__get__`` (and
``__set__`` and ``__delete__``, if present) methods are called with
the wrapped (adapted) object as "self".  For functions, this has the
effect of creating a bound method linking the generic function to the
wrapped object.

For non-function attributes, it may be easiest to specify them using
the ``property`` built-in, and the corresponding ``fget``, ``fset``,
and ``fdel`` attributes::

    class ILength(Interface):
        @property
        @abstract
        def length(self):
            """Read-only length attribute"""

    # ILength(aList).length == list.__len__(aList)
    when(ILength.length.fget, (list,))(list.__len__)

Alternatively, methods such as ``_get_foo()`` and ``_set_foo()``
may be defined as part of the interface, and the property defined
in terms of those methods, but this is a bit more difficult for users
to implement correctly when creating a class that directly implements
the interface, as they would then need to match all the individual
method names, not just the name of the property or attribute.
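The effect of the ``ILength`` declaration above can be sketched with
an ordinary adapter class.  This is illustrative only, not the
machinery the PEP's ``Interface`` would actually generate:

```python
class ListLengthAdapter:
    """Roughly what ILength(aList) would behave like for lists."""

    def __init__(self, adaptee):
        self.adaptee = adaptee

    @property
    def length(self):
        # Stands in for the overload declared via
        # when(ILength.length.fget, (list,))(list.__len__):
        # the fget implementation is called with the wrapped
        # (adapted) object as "self".
        return list.__len__(self.adaptee)
```

With this sketch, ``ListLengthAdapter([1, 2, 3]).length`` yields the
same value as ``len([1, 2, 3])``.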


Aspects
-------

The adaptation system described above assumes that adapters are "stateless",
which is to say that adapters have no attributes or state apart from
that of the adapted object.  This follows the "typeclass/instance"
model of Haskell, and the concept of "pure" (i.e., transitively
composable) adapters.

However, there are occasionally cases where, to provide a complete
implementation of some interface, some sort of additional state is
required.

One possibility, of course, would be to attach monkeypatched "private"
attributes to the adaptee.  But this is subject to name collisions,
and complicates the process of initialization (since any code using
these attributes has to check for their existence and initialize them
if necessary).  It also doesn't work on objects that don't have a
``__dict__`` attribute.

So the ``Aspect`` class is provided to make it easy to attach extra
information to objects that either:

1. have a ``__dict__`` attribute (so aspect instances can be stored
   in it, keyed by aspect class),

2. support weak referencing (so aspect instances can be managed using
   a global but thread-safe weak-reference dictionary), or

3. implement or can be adapted to the ``overloading.IAspectOwner``
   interface (technically, #1 or #2 imply this).

Subclassing ``Aspect`` creates an adapter class whose state is tied
to the life of the adapted object.

For example, suppose you would like to count all the times a certain
method is called on instances of ``Target`` (a classic AOP example).
You might do something like::

    from overloading import Aspect

    class Count(Aspect):
        count = 0

    @after(Target.some_method)
    def count_after_call(self:Target, *args, **kw):
        Count(self).count += 1

The above code will keep track of the number of times that
``Target.some_method()`` is successfully called on an instance of
``Target`` (i.e., it will not count errors unless they occur in a
more-specific "after" method).  Other code can then access the count
using ``Count(someTarget).count``.

``Aspect`` instances can of course have ``__init__`` methods, to
initialize any data structures.  They can use either ``__slots__``
or dictionary-based attributes for storage.
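A minimal sketch of the weak-reference strategy (case #2 above) might
look like the following.  The names and details are illustrative, not
the actual ``overloading.Aspect`` implementation, and re-running
``__init__`` on repeated lookups is glossed over here:

```python
import weakref

class Aspect:
    # One {aspect class: aspect instance} dict per target object,
    # held weakly so the aspect state dies with the target.
    _owners = weakref.WeakKeyDictionary()

    def __new__(cls, ob):
        aspects = Aspect._owners.setdefault(ob, {})
        if cls not in aspects:
            # First lookup for this (object, aspect class) pair:
            # create and cache the aspect instance.
            aspects[cls] = super().__new__(cls)
        return aspects[cls]

class Count(Aspect):
    count = 0

class Target:
    def some_method(self):
        pass
```

After ``Count(t).count += 1`` has run twice for some ``Target``
instance ``t``, ``Count(t).count`` is ``2``; the ``Count`` instance
is garbage-collected along with ``t``.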

While this facility is rather primitive compared to a full-featured
AOP tool like AspectJ, persons who wish to build pointcut libraries
or other AspectJ-like features can certainly use ``Aspect`` objects
and method-combination decorators as a base for building more
expressive AOP tools.

XXX spec out full aspect API, including keys, N-to-1 aspects, manual
    attach/detach/delete of aspect instances, and the ``IAspectOwner``
    interface.
    
    
Extension API
=============

TODO: explain how all of these work

implies(o1, o2)

declare_implementation(iface, class)

predicate_signatures(ob)

parse_rule(ruleset, body, predicate, actiontype, localdict, globaldict)

combine_actions(a1, a2)

rules_for(f)

Rule objects

ActionDef objects

RuleSet objects

Method objects

MethodList objects

IAspectOwner


Overloading Usage Patterns
==========================

In discussion on the Python-3000 list, the proposed feature of allowing
arbitrary functions to be overloaded has been somewhat controversial,
with some people expressing concern that this would make programs more
difficult to understand.

The general thrust of this argument is that one cannot rely on what a
function does, if it can be changed from anywhere in the program at any
time.  Even though in principle this can already happen through
monkeypatching or code substitution, it is considered poor practice to
do so.

However, providing support for overloading any function (or so the
argument goes), is implicitly blessing such changes as being an
acceptable practice.

This argument appears to make sense in theory, but it is almost entirely
mooted in practice for two reasons.

First, people are generally not perverse, defining a function to do one
thing in one place, and then summarily defining it to do the opposite
somewhere else!  The principal reasons to extend the behavior of a
function that has *not* been specifically made generic are to:

* Add special cases not contemplated by the original function's author,
  such as support for additional types.

* Be notified of an action in order to cause some related operation to
  be performed, either before the original operation is performed,
  after it, or both.  This can include general-purpose operations like
  adding logging, timing, or tracing, as well as application-specific
  behavior.

Neither of these reasons for adding overloads implies any change to the
intended default or overall behavior of the existing function, however.
Just as a base class method may be overridden by a subclass for these
same two reasons, so too may a function be overloaded to provide for
such enhancements.

In other words, universal overloading does not equal *arbitrary*
overloading, in the sense that we need not expect people to randomly
redefine the behavior of existing functions in illogical or
unpredictable ways.  If they did so, it would be no less of a bad
practice than any other way of writing illogical or unpredictable code!

However, to distinguish bad practice from good, it is perhaps necessary
to clarify further what good practice for defining overloads *is*.  And
that brings us to the second reason why generic functions do not
necessarily make programs harder to understand: overloading patterns in
actual programs tend to follow very predictable patterns.  (Both in
Python and in languages that have no *non*-generic functions.)

If a module is defining a new generic operation, it will usually also
define any required overloads for existing types in the same place.
Likewise, if a module is defining a new type, then it will usually
define overloads there for any generic functions that it knows or cares
about.

As a result, the vast majority of overloads can be found adjacent to
either the function being overloaded, or to a newly-defined type for
which the overload is adding support.  Thus, overloads are highly
discoverable in the common case, as you are either looking at the
function or the type, or both.

It is only in rather infrequent cases that one will have overloads in a
module that contains neither the function nor the type(s) for which the
overload is added.  This would be the case if, say, a third-party
created a bridge of support between one library's types and another
library's generic function(s).  In such a case, however, best practice
suggests prominently advertising this, especially by way of the module
name.

For example, PyProtocols defines such bridge support for working with
Zope interfaces and legacy Twisted interfaces, using modules called
``protocols.twisted_support`` and ``protocols.zope_support``.  (These
bridges are done with interface adapters, rather than generic functions,
but the basic principle is the same.)

In short, understanding programs in the presence of universal
overloading need not be any more difficult, given that the vast majority
of overloads will either be adjacent to a function, or the definition of
a type that is passed to that function.

And, in the absence of incompetence or deliberate intention to be
obscure, the few overloads that are not adjacent to the relevant type(s)
or function(s), will generally not need to be understood or known about
outside the scope where those overloads are defined.  (Except in the
"support modules" case, where best practice suggests naming them
accordingly.)


Implementation Notes
====================

Most of the functionality described in this PEP is already implemented
in the in-development version of the PEAK-Rules framework.  In
particular, the basic overloading and method combination framework
(minus the ``@overload`` decorator) already exists there.  The
implementation of all of these features in ``peak.rules.core`` is 656
lines of Python at this writing.

``peak.rules.core`` currently relies on the DecoratorTools and
BytecodeAssembler modules, but both of these dependencies can be
replaced, as DecoratorTools is used mainly for Python 2.3
compatibility and to implement structure types (which can be done
with named tuples in later versions of Python).  The use of
BytecodeAssembler can be replaced using an "exec" or "compile"
workaround, given a reasonable effort.  (It would be easier to do this
if the ``func_closure`` attribute of function objects was writable.)

The ``Interface`` class has been previously prototyped, but is not
included in PEAK-Rules at the present time.

The "implicit class rule" has previously been implemented in the
RuleDispatch library.  However, it relies on the ``__metaclass__``
hook, which is eliminated by PEP 3115.

I don't currently know how to make ``@overload`` play nicely with
``classmethod`` and ``staticmethod`` in class bodies.  It's not really
clear if it needs to, however.


Copyright
=========

This document has been placed in the public domain.



..
   Local Variables:
   mode: indented-text
   indent-tabs-mode: nil
   sentence-end-double-space: t
   fill-column: 70
   coding: utf-8
   End: