Collection Configuration and Techniques

The relationship() function defines a linkage between two classes. When the linkage defines a one-to-many or many-to-many relationship, it’s represented as a Python collection when objects are loaded and manipulated. This section presents additional information about collection configuration and techniques.

Working with Large Collections

The default behavior of relationship() is to fully load the collection of items in, according to the loading strategy of the relationship. Additionally, the Session by default only knows how to delete objects which are actually present within the session. When a parent instance is marked for deletion and flushed, the Session loads its full list of child items in so that they may either be deleted as well, or have their foreign key value set to null; this is to avoid constraint violations. For large collections of child items, there are several strategies to bypass full loading of child items both at load time as well as deletion time.

Dynamic Relationship Loaders

A key feature to enable management of a large collection is the so-called “dynamic” relationship. This is an optional form of relationship() which returns a Query object in place of a collection when accessed. filter() criterion may be applied as well as limits and offsets, either explicitly or via array slices:

  class User(Base):
      __tablename__ = 'user'

      posts = relationship(Post, lazy="dynamic")

  jack = session.query(User).get(id)

  # filter Jack's blog posts
  posts = jack.posts.filter(Post.headline=='this is a post')

  # apply array slices
  posts = jack.posts[5:20]

The dynamic relationship supports limited write operations, via the append() and remove() methods:

  oldpost = jack.posts.filter(Post.headline=='old post').one()
  jack.posts.remove(oldpost)

  jack.posts.append(Post('new post'))

Since the read side of the dynamic relationship always queries the database, changes to the underlying collection will not be visible until the data has been flushed. However, as long as “autoflush” is enabled on the Session in use, this will occur automatically each time the collection is about to emit a query.
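As a runnable sketch of this autoflush interaction, the following uses hypothetical User/Post models mirroring the example above; counting the dynamic collection emits a SELECT, which flushes the pending Post first:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()

class User(Base):
    __tablename__ = 'user'
    id = Column(Integer, primary_key=True)
    posts = relationship("Post", lazy="dynamic")

class Post(Base):
    __tablename__ = 'post'
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey('user.id'))
    headline = Column(String)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(engine)()

jack = User()
session.add(jack)
jack.posts.append(Post(headline="new post"))

# counting the dynamic collection emits a SELECT; with autoflush on
# (the default), the pending Post is flushed first and is counted
assert jack.posts.count() == 1
```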

To place a dynamic relationship on a backref, use the backref() function in conjunction with lazy='dynamic':

  class Post(Base):
      __table__ = posts_table

      user = relationship(User,
                  backref=backref('posts', lazy='dynamic')
              )

Note that eager/lazy loading options cannot be used in conjunction with dynamic relationships at this time.

Note

The dynamic_loader() function is essentially the same as relationship() with the lazy='dynamic' argument specified.

Warning

The “dynamic” loader applies to collections only. It is not valid to use “dynamic” loaders with many-to-one, one-to-one, or uselist=False relationships. Newer versions of SQLAlchemy emit warnings or exceptions in these cases.

Setting Noload, RaiseLoad

A “noload” relationship never loads from the database, even when accessed. It is configured using lazy='noload':

  class MyClass(Base):
      __tablename__ = 'some_table'

      children = relationship(MyOtherClass, lazy='noload')

Above, the children collection is fully writeable, and changes to it will be persisted to the database as well as locally available for reading at the time they are added. However, when instances of MyClass are freshly loaded from the database, the children collection stays empty. The noload strategy is also available on a query option basis using the orm.noload() loader option.
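A minimal sketch of this behavior, using a hypothetical parent/child mapping; the child row is persisted through the writeable collection, but a freshly loaded parent reports an empty collection:

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()

class MyClass(Base):
    __tablename__ = 'some_table'
    id = Column(Integer, primary_key=True)
    children = relationship("MyOtherClass", lazy='noload')

class MyOtherClass(Base):
    __tablename__ = 'some_other_table'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey('some_table.id'))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
Session = sessionmaker(engine)

session = Session()
session.add(MyClass(children=[MyOtherClass()]))  # writes are persisted normally
session.commit()
session.close()

session = Session()
loaded = session.query(MyClass).first()
# the child row exists in the database, but "noload" leaves the collection empty
assert list(loaded.children) == []
```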

Alternatively, a “raise”-loaded relationship will raise an InvalidRequestError where the attribute would normally emit a lazy load:

  class MyClass(Base):
      __tablename__ = 'some_table'

      children = relationship(MyOtherClass, lazy='raise')

Above, attribute access on the children collection will raise an exception if it was not previously eager loaded. This includes read access, but for collections it will also affect write access, as collections can’t be mutated without first loading them. The rationale for this is to ensure that an application is not emitting any unexpected lazy loads within a certain context. Rather than having to read through SQL logs to determine that all necessary attributes were eager loaded, the “raise” strategy will cause unloaded attributes to raise immediately if accessed. The raise strategy is also available on a query option basis using the orm.raiseload() loader option.

New in version 1.1: added the “raise” loader strategy.

See also

Preventing unwanted lazy loads using raiseload
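The runtime behavior can be sketched as follows, with hypothetical classes: unloaded access raises InvalidRequestError, while eager loading the collection up front via joinedload() makes the same access legal:

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine
from sqlalchemy.exc import InvalidRequestError
from sqlalchemy.orm import (declarative_base, joinedload, relationship,
                            sessionmaker)

Base = declarative_base()

class MyClass(Base):
    __tablename__ = 'some_table'
    id = Column(Integer, primary_key=True)
    children = relationship("MyOtherClass", lazy='raise')

class MyOtherClass(Base):
    __tablename__ = 'some_other_table'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey('some_table.id'))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
Session = sessionmaker(engine)

session = Session()
session.add(MyClass(children=[MyOtherClass()]))
session.commit()
session.close()

session = Session()
obj = session.query(MyClass).first()
try:
    obj.children            # would emit a lazy load
    raised = False
except InvalidRequestError:
    raised = True
assert raised

# eager loading the collection up front makes the same access legal
obj2 = session.query(MyClass).options(joinedload(MyClass.children)).first()
assert len(obj2.children) == 1
```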

Using Passive Deletes

Use passive_deletes to disable child object loading on a DELETE operation, in conjunction with “ON DELETE (CASCADE|SET NULL)” on your database to automatically cascade deletes to child objects:

  class MyClass(Base):
      __tablename__ = 'mytable'
      id = Column(Integer, primary_key=True)
      children = relationship("MyOtherClass",
                      cascade="all, delete-orphan",
                      passive_deletes=True)

  class MyOtherClass(Base):
      __tablename__ = 'myothertable'
      id = Column(Integer, primary_key=True)
      parent_id = Column(Integer,
                      ForeignKey('mytable.id', ondelete='CASCADE')
                  )

Note

To use “ON DELETE CASCADE”, the underlying database engine must support foreign keys.

When passive_deletes is applied, the children relationship will not be loaded into memory when an instance of MyClass is marked for deletion. The cascade="all, delete-orphan" will take effect for instances of MyOtherClass which are currently present in the session; however, for instances of MyOtherClass which are not loaded, SQLAlchemy assumes that “ON DELETE CASCADE” rules will ensure that those rows are deleted by the database.

See also

orm.mapper.passive_deletes - similar feature on mapper()
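The full round trip can be sketched with SQLite, which requires a connection-level PRAGMA before it enforces foreign keys; the engine setup and sample data here are hypothetical:

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine, event
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()

class MyClass(Base):
    __tablename__ = 'mytable'
    id = Column(Integer, primary_key=True)
    children = relationship("MyOtherClass",
                    cascade="all, delete-orphan",
                    passive_deletes=True)

class MyOtherClass(Base):
    __tablename__ = 'myothertable'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer,
                    ForeignKey('mytable.id', ondelete='CASCADE'),
                    nullable=False)

engine = create_engine("sqlite://")

# SQLite leaves foreign key enforcement off by default; enable it per connection
@event.listens_for(engine, "connect")
def _set_sqlite_fks(dbapi_connection, connection_record):
    cursor = dbapi_connection.cursor()
    cursor.execute("PRAGMA foreign_keys=ON")
    cursor.close()

Base.metadata.create_all(engine)
Session = sessionmaker(engine)

session = Session()
session.add(MyClass(children=[MyOtherClass(), MyOtherClass()]))
session.commit()
session.close()

session = Session()
parent = session.query(MyClass).first()
session.delete(parent)   # children are not loaded; only the parent DELETE is emitted
session.commit()
assert session.query(MyOtherClass).count() == 0   # the database cascaded the delete
```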

Customizing Collection Access

Mapping a one-to-many or many-to-many relationship results in a collection of values accessible through an attribute on the parent instance. By default, this collection is a list:

  class Parent(Base):
      __tablename__ = 'parent'
      parent_id = Column(Integer, primary_key=True)

      children = relationship(Child)

  parent = Parent()
  parent.children.append(Child())
  print(parent.children[0])

Collections are not limited to lists. Sets, mutable sequences and almost any other Python object that can act as a container can be used in place of the default list, by specifying the collection_class option on relationship():

  class Parent(Base):
      __tablename__ = 'parent'
      parent_id = Column(Integer, primary_key=True)

      # use a set
      children = relationship(Child, collection_class=set)

  parent = Parent()
  child = Child()
  parent.children.add(child)
  assert child in parent.children

Dictionary Collections

A little extra detail is needed when using a dictionary as a collection. This is because objects are always loaded from the database as lists, and a key-generation strategy must be available to populate the dictionary correctly. The attribute_mapped_collection() function is by far the most common way to achieve a simple dictionary collection. It produces a dictionary class that will apply a particular attribute of the mapped class as a key. Below we map an Item class containing a dictionary of Note items keyed to the Note.keyword attribute:

  from sqlalchemy import Column, Integer, String, ForeignKey
  from sqlalchemy.orm import relationship
  from sqlalchemy.orm.collections import attribute_mapped_collection
  from sqlalchemy.ext.declarative import declarative_base

  Base = declarative_base()

  class Item(Base):
      __tablename__ = 'item'
      id = Column(Integer, primary_key=True)
      notes = relationship("Note",
                  collection_class=attribute_mapped_collection('keyword'),
                  cascade="all, delete-orphan")

  class Note(Base):
      __tablename__ = 'note'
      id = Column(Integer, primary_key=True)
      item_id = Column(Integer, ForeignKey('item.id'), nullable=False)
      keyword = Column(String)
      text = Column(String)

      def __init__(self, keyword, text):
          self.keyword = keyword
          self.text = text

Item.notes is then a dictionary:

  >>> item = Item()
  >>> item.notes['a'] = Note('a', 'atext')
  >>> item.notes.items()
  {'a': <__main__.Note object at 0x2eaaf0>}

attribute_mapped_collection() will ensure that the .keyword attribute of each Note complies with the key in the dictionary. For example, when assigning to Item.notes, the dictionary key we supply must match that of the actual Note object:

  item = Item()
  item.notes = {
      'a': Note('a', 'atext'),
      'b': Note('b', 'btext')
  }

The attribute which attribute_mapped_collection() uses as a key does not need to be mapped at all! Using a regular Python @property allows virtually any detail or combination of details about the object to be used as the key, as below when we establish it as a tuple of Note.keyword and the first ten letters of the Note.text field:

  class Item(Base):
      __tablename__ = 'item'
      id = Column(Integer, primary_key=True)
      notes = relationship("Note",
                  collection_class=attribute_mapped_collection('note_key'),
                  backref="item",
                  cascade="all, delete-orphan")

  class Note(Base):
      __tablename__ = 'note'
      id = Column(Integer, primary_key=True)
      item_id = Column(Integer, ForeignKey('item.id'), nullable=False)
      keyword = Column(String)
      text = Column(String)

      @property
      def note_key(self):
          return (self.keyword, self.text[0:10])

      def __init__(self, keyword, text):
          self.keyword = keyword
          self.text = text

Above we added a Note.item backref. Assigning to this reverse relationship, the Note is added to the Item.notes dictionary and the key is generated for us automatically:

  >>> item = Item()
  >>> n1 = Note("a", "atext")
  >>> n1.item = item
  >>> item.notes
  {('a', 'atext'): <__main__.Note object at 0x2eaaf0>}

Other built-in dictionary types include column_mapped_collection(), which is almost like attribute_mapped_collection() except it is given the Column object directly:

  from sqlalchemy.orm.collections import column_mapped_collection

  class Item(Base):
      __tablename__ = 'item'
      id = Column(Integer, primary_key=True)
      notes = relationship("Note",
                  collection_class=column_mapped_collection(Note.__table__.c.keyword),
                  cascade="all, delete-orphan")

as well as mapped_collection(), which is passed any callable function. Note that it’s usually easier to use attribute_mapped_collection() along with a @property as mentioned earlier:

  from sqlalchemy.orm.collections import mapped_collection

  class Item(Base):
      __tablename__ = 'item'
      id = Column(Integer, primary_key=True)
      notes = relationship("Note",
                  collection_class=mapped_collection(lambda note: note.text[0:10]),
                  cascade="all, delete-orphan")

Dictionary mappings are often combined with the “Association Proxy” extension to produce streamlined dictionary views. See Proxying to Dictionary Based Collections and Composite Association Proxies for examples.
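As a hedged sketch of such a combination, the association_proxy() below (the attribute name note_texts is invented for illustration) exposes the Item.notes dictionary as a plain dict of strings; for dictionary-based collections the creator callable receives both key and value:

```python
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.ext.associationproxy import association_proxy
from sqlalchemy.orm import declarative_base, relationship
from sqlalchemy.orm.collections import attribute_mapped_collection

Base = declarative_base()

class Item(Base):
    __tablename__ = 'item'
    id = Column(Integer, primary_key=True)
    notes = relationship("Note",
                collection_class=attribute_mapped_collection('keyword'),
                cascade="all, delete-orphan")

    # a plain string-to-string view of .notes; for dict collections the
    # creator receives (key, value) pairs
    note_texts = association_proxy('notes', 'text',
                    creator=lambda keyword, text: Note(keyword, text))

class Note(Base):
    __tablename__ = 'note'
    id = Column(Integer, primary_key=True)
    item_id = Column(Integer, ForeignKey('item.id'), nullable=False)
    keyword = Column(String)
    text = Column(String)

    def __init__(self, keyword, text):
        self.keyword = keyword
        self.text = text

item = Item()
item.note_texts['color'] = 'blue'   # creates Note('color', 'blue') under the hood
assert item.notes['color'].text == 'blue'
assert item.note_texts['color'] == 'blue'
```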

  • sqlalchemy.orm.collections.attribute_mapped_collection(attr_name)
  • A dictionary-based collection type with attribute-based keying.

Returns a MappedCollection factory with a keying based on the ‘attr_name’ attribute of entities in the collection, where attr_name is the string name of the attribute.

The key value must be immutable for the lifetime of the object. You can not, for example, map on foreign key values if those key values will change during the session, i.e. from None to a database-assigned integer after a session flush.

  • sqlalchemy.orm.collections.column_mapped_collection(mapping_spec)
  • A dictionary-based collection type with column-based keying.

Returns a MappedCollection factory with a keying function generated from mapping_spec, which may be a Column or a sequence of Columns.

The key value must be immutable for the lifetime of the object. You can not, for example, map on foreign key values if those key values will change during the session, i.e. from None to a database-assigned integer after a session flush.

  • sqlalchemy.orm.collections.mapped_collection(keyfunc)
  • A dictionary-based collection type with arbitrary keying.

Returns a MappedCollection factory with a keying function generated from keyfunc, a callable that takes an entity and returns a key value.

The key value must be immutable for the lifetime of the object. You can not, for example, map on foreign key values if those key values will change during the session, i.e. from None to a database-assigned integer after a session flush.

Custom Collection Implementations

You can use your own types for collections as well. In simple cases, inheriting from list or set, adding custom behavior, is all that’s needed. In other cases, special decorators are needed to tell SQLAlchemy more detail about how the collection operates.

Do I need a custom collection implementation?

In most cases, not at all! The most common use case for a “custom” collection is one that validates or marshals incoming values into a new form, such as a string that becomes a class instance, or one which goes a step beyond and represents the data internally in some fashion, presenting a “view” of that data on the outside in a different form.

For the first use case, the orm.validates() decorator is by far the simplest way to intercept incoming values in all cases for the purposes of validation and simple marshaling. See Simple Validators for an example of this.

For the second use case, the Association Proxy extension is a well-tested, widely used system that provides a read/write “view” of a collection in terms of some attribute present on the target object. As the target attribute can be a @property that returns virtually anything, a wide array of “alternative” views of a collection can be constructed with just a few functions. This approach leaves the underlying mapped collection unaffected and avoids the need to carefully tailor collection behavior on a method-by-method basis.

Customized collections are useful when the collection needs to have special behaviors upon access or mutation operations that can’t otherwise be modeled externally to the collection. They can of course be combined with the above two approaches.

Collections in SQLAlchemy are transparently instrumented. Instrumentation means that normal operations on the collection are tracked and result in changes being written to the database at flush time. Additionally, collection operations can fire events which indicate some secondary operation must take place. Examples of a secondary operation include saving the child item in the parent’s Session (i.e. the save-update cascade), as well as synchronizing the state of a bi-directional relationship (i.e. a backref()).

The collections package understands the basic interface of lists, sets and dicts and will automatically apply instrumentation to those built-in types and their subclasses. Object-derived types that implement a basic collection interface are detected and instrumented via duck-typing:

  class ListLike(object):
      def __init__(self):
          self.data = []
      def append(self, item):
          self.data.append(item)
      def remove(self, item):
          self.data.remove(item)
      def extend(self, items):
          self.data.extend(items)
      def __iter__(self):
          return iter(self.data)
      def foo(self):
          return 'foo'

append, remove, and extend are known list-like methods, and will be instrumented automatically. __iter__ is not a mutator method and won’t be instrumented, and foo won’t be either.

Duck-typing (i.e. guesswork) isn’t rock-solid, of course, so you can be explicit about the interface you are implementing by providing an __emulates__ class attribute:

  class SetLike(object):
      __emulates__ = set

      def __init__(self):
          self.data = set()
      def append(self, item):
          self.data.add(item)
      def remove(self, item):
          self.data.remove(item)
      def __iter__(self):
          return iter(self.data)

This class looks list-like because of append, but __emulates__ forces it to be treated as set-like. remove is known to be part of the set interface and will be instrumented.

But this class won’t work quite yet: a little glue is needed to adapt it for use by SQLAlchemy. The ORM needs to know which methods to use to append, remove and iterate over members of the collection. When using a type like list or set, the appropriate methods are well-known and used automatically when present. This set-like class does not provide the expected add method, so we must supply an explicit mapping for the ORM via a decorator.

Annotating Custom Collections via Decorators

Decorators can be used to tag the individual methods the ORM needs to manage collections. Use them when your class doesn’t quite meet the regular interface for its container type, or when you otherwise would like to use a different method to get the job done.

  from sqlalchemy.orm.collections import collection

  class SetLike(object):
      __emulates__ = set

      def __init__(self):
          self.data = set()

      @collection.appender
      def append(self, item):
          self.data.add(item)

      def remove(self, item):
          self.data.remove(item)

      def __iter__(self):
          return iter(self.data)

And that’s all that’s needed to complete the example. SQLAlchemy will add instances via the append method. remove and __iter__ are the default methods for sets and will be used for removing and iteration. Default methods can be changed as well:

  from sqlalchemy.orm.collections import collection

  class MyList(list):
      @collection.remover
      def zark(self, item):
          # do something special...
          ...

      @collection.iterator
      def hey_use_this_instead_for_iteration(self):
          # ...
          ...

There is no requirement to be list- or set-like at all. Collection classes can be any shape, so long as they have the append, remove and iterate interface marked for SQLAlchemy’s use. Append and remove methods will be called with a mapped entity as the single argument, and iterator methods are called with no arguments and must return an iterator.
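To illustrate, here is a hypothetical collection with no list- or set-like shape at all: it stores members in an internal dict keyed by id() and marks its appender, remover and iterator explicitly:

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine
from sqlalchemy.orm import declarative_base, relationship, sessionmaker
from sqlalchemy.orm.collections import collection

class Bag(object):
    """Hypothetical container: members live in a dict keyed by id()."""

    def __init__(self):
        self._members = {}

    @collection.appender
    def _append(self, member):
        self._members[id(member)] = member

    @collection.remover
    def _remove(self, member):
        del self._members[id(member)]

    @collection.iterator
    def _iterate(self):
        return iter(list(self._members.values()))

Base = declarative_base()

class Parent(Base):
    __tablename__ = 'parent'
    id = Column(Integer, primary_key=True)
    children = relationship("Child", collection_class=Bag)

class Child(Base):
    __tablename__ = 'child'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey('parent.id'))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(engine)()

p = Parent()
p.children._append(Child())   # instrumented: fires the ORM append event
session.add(p)
session.commit()

members = list(session.query(Parent).first().children._iterate())
assert len(members) == 1
```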

  • class sqlalchemy.orm.collections.collection
  • Decorators for entity collection classes.

The decorators fall into two groups: annotations and interception recipes.

The annotating decorators (appender, remover, iterator, linker, converter, internally_instrumented) indicate the method’s purpose and take no arguments. They are not written with parens:

  @collection.appender
  def append(self, append): ...

The recipe decorators all require parens, even those that take no arguments:

  @collection.adds('entity')
  def insert(self, position, entity): ...

  @collection.removes_return()
  def popitem(self): ...

  • static adds(arg)
  • Mark the method as adding an entity to the collection.

Adds “add to collection” handling to the method. The decorator argument indicates which method argument holds the SQLAlchemy-relevant value. Arguments can be specified positionally (i.e. integer) or by name:

  @collection.adds(1)
  def push(self, item): ...

  @collection.adds('entity')
  def do_stuff(self, thing, entity=None): ...

  • static appender(fn)
  • Tag the method as the collection appender.

The appender method is called with one positional argument: the value to append. The method will be automatically decorated with ‘adds(1)’ if not already decorated:

  @collection.appender
  def add(self, append): ...

  # or, equivalently
  @collection.appender
  @collection.adds(1)
  def add(self, append): ...

  # for mapping type, an 'append' may kick out a previous value
  # that occupies that slot.  consider d['a'] = 'foo'- any previous
  # value in d['a'] is discarded.
  @collection.appender
  @collection.replaces(1)
  def add(self, entity):
      key = some_key_func(entity)
      previous = None
      if key in self:
          previous = self[key]
      self[key] = entity
      return previous

If the value to append is not allowed in the collection, you may raise an exception. Something to remember is that the appender will be called for each object mapped by a database query. If the database contains rows that violate your collection semantics, you will need to get creative to fix the problem, as access via the collection will not work.

If the appender method is internally instrumented, you must also receive the keyword argument ‘_sa_initiator’ and ensure its promulgation to collection events.

  • static converter(fn)
  • Tag the method as the collection converter.

Deprecated since version 1.3: The collection.converter() handler is deprecated and will be removed in a future release. Please refer to the AttributeEvents.bulk_replace listener interface in conjunction with the event.listen() function.

This optional method will be called when a collection is being replaced entirely, as in:

  1. myobj.acollection = [newvalue1, newvalue2]

The converter method will receive the object being assigned and should return an iterable of values suitable for use by the appender method. A converter must not assign values or mutate the collection; its sole job is to adapt the value the user provides into an iterable of values for the ORM’s use.

The default converter implementation will use duck-typing to do the conversion. A dict-like collection will be converted into an iterable of dictionary values, and other types will simply be iterated:

  @collection.converter
  def convert(self, other): ...

If the duck-typing of the object does not match the type of this collection, a TypeError is raised.

Supply an implementation of this method if you want to expand the range of possible types that can be assigned in bulk or perform validation on the values about to be assigned.

  • static internally_instrumented(fn)
  • Tag the method as instrumented.

This tag will prevent any decoration from being applied to the method. Use this if you are orchestrating your own calls to collection_adapter() in one of the basic SQLAlchemy interface methods, or to prevent an automatic ABC method decoration from wrapping your implementation:

  # normally an 'extend' method on a list-like class would be
  # automatically intercepted and re-implemented in terms of
  # SQLAlchemy events and append().  your implementation will
  # never be called, unless:
  @collection.internally_instrumented
  def extend(self, items): ...
  • static iterator(fn)
  • Tag the method as the collection iterator.

The iterator method is called with no arguments. It is expected to return an iterator over all collection members:

  @collection.iterator
  def __iter__(self): ...

  • static link(fn)
  • Deprecated since version 1.0: collection.link() is deprecated and will be removed in a future release.

  • static linker(fn)
  • Tag the method as a “linked to attribute” event handler.

Deprecated since version 1.0: The collection.linker() handler is deprecated and will be removed in a future release. Please refer to the AttributeEvents.init_collection() and AttributeEvents.dispose_collection() event handlers.

This optional event handler will be called when the collection class is linked to or unlinked from the InstrumentedAttribute. It is invoked immediately after the ‘_sa_adapter’ property is set on the instance. A single argument is passed: the collection adapter that has been linked, or None if unlinking.

  • static remover(fn)
  • Tag the method as the collection remover.

The remover method is called with one positional argument: the value to remove. The method will be automatically decorated with removes_return() if not already decorated:

  @collection.remover
  def zap(self, entity): ...

  # or, equivalently
  @collection.remover
  @collection.removes_return()
  def zap(self, ): ...

If the value to remove is not present in the collection, you may raise an exception or return None to ignore the error.

If the remove method is internally instrumented, you must also receive the keyword argument ‘_sa_initiator’ and ensure its promulgation to collection events.

  • static removes(arg)
  • Mark the method as removing an entity in the collection.

Adds “remove from collection” handling to the method. The decorator argument indicates which method argument holds the SQLAlchemy-relevant value to be removed. Arguments can be specified positionally (i.e. integer) or by name:

  @collection.removes(1)
  def zap(self, item): ...

For methods where the value to remove is not known at call-time, use collection.removes_return.

  • static removes_return()
  • Mark the method as removing an entity in the collection.

Adds “remove from collection” handling to the method. The return value of the method, if any, is considered the value to remove. The method arguments are not inspected:

  @collection.removes_return()
  def pop(self): ...

For methods where the value to remove is known at call-time, use collection.removes.

  • static replaces(arg)
  • Mark the method as replacing an entity in the collection.

Adds “add to collection” and “remove from collection” handling to the method. The decorator argument indicates which method argument holds the SQLAlchemy-relevant value to be added, and the return value, if any, will be considered the value to remove.

Arguments can be specified positionally (i.e. integer) or by name:

  @collection.replaces(2)
  def __setitem__(self, index, item): ...
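Tying the recipe decorators together, below is a hypothetical stack-like collection: push is the appender, while pop uses removes_return() since the value removed is only known when the method returns:

```python
from sqlalchemy import Column, ForeignKey, Integer
from sqlalchemy.orm import declarative_base, relationship
from sqlalchemy.orm.collections import collection

class Stack(object):
    """Hypothetical LIFO collection; only the marked methods matter to the ORM."""
    __emulates__ = list

    def __init__(self):
        self._data = []

    @collection.appender
    def push(self, item):
        self._data.append(item)

    @collection.remover
    def remove(self, item):
        self._data.remove(item)

    @collection.removes_return()
    def pop(self):
        # the returned entity is the value "removed" as far as the ORM is concerned
        return self._data.pop()

    @collection.iterator
    def __iter__(self):
        return iter(self._data)

Base = declarative_base()

class Parent(Base):
    __tablename__ = 'parent'
    id = Column(Integer, primary_key=True)
    children = relationship("Child", collection_class=Stack)

class Child(Base):
    __tablename__ = 'child'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey('parent.id'))

p = Parent()
c1, c2 = Child(), Child()
p.children.push(c1)
p.children.push(c2)
popped = p.children.pop()   # fires a remove event for c2
assert popped is c2
assert list(p.children) == [c1]
```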

Custom Dictionary-Based Collections

The MappedCollection class can be used as a base class for your custom types or as a mix-in to quickly add dict collection support to other classes. It uses a keying function to delegate to __setitem__ and __delitem__:

  from sqlalchemy.util import OrderedDict
  from sqlalchemy.orm.collections import MappedCollection

  class NodeMap(OrderedDict, MappedCollection):
      """Holds 'Node' objects, keyed by the 'name' attribute with insert order maintained."""

      def __init__(self, *args, **kw):
          MappedCollection.__init__(self, keyfunc=lambda node: node.name)
          OrderedDict.__init__(self, *args, **kw)

When subclassing MappedCollection, user-defined versions of __setitem__() or __delitem__() should be decorated with collection.internally_instrumented(), if they call down to those same methods on MappedCollection. This is because the methods on MappedCollection are already instrumented - calling them from within an already instrumented call can cause events to be fired off repeatedly, or inappropriately, leading to internal state corruption in rare cases:

  from sqlalchemy.orm.collections import MappedCollection,\
      collection

  class MyMappedCollection(MappedCollection):
      """Use @internally_instrumented when your methods
      call down to already-instrumented methods.

      """

      @collection.internally_instrumented
      def __setitem__(self, key, value, _sa_initiator=None):
          # do something with key, value
          super(MyMappedCollection, self).__setitem__(key, value, _sa_initiator)

      @collection.internally_instrumented
      def __delitem__(self, key, _sa_initiator=None):
          # do something with key
          super(MyMappedCollection, self).__delitem__(key, _sa_initiator)

The ORM understands the dict interface just like lists and sets, and will automatically instrument all dict-like methods if you choose to subclass dict or provide dict-like collection behavior in a duck-typed class. You must decorate appender and remover methods, however - there are no compatible methods in the basic dictionary interface for SQLAlchemy to use by default. Iteration will go through itervalues() unless otherwise decorated.

Note

Due to a bug in MappedCollection prior to version 0.7.6, this workaround usually needs to be called before a custom subclass of MappedCollection which uses collection.internally_instrumented() can be used:

  from sqlalchemy.orm.collections import _instrument_class, MappedCollection
  _instrument_class(MappedCollection)

This will ensure that the MappedCollection has been properly initialized with custom __setitem__() and __delitem__() methods before being used in a custom subclass.

  • class sqlalchemy.orm.collections.MappedCollection(keyfunc)
  • Bases: builtins.dict

A basic dictionary-based collection class.

Extends dict with the minimal bag semantics that collection classes require. set and remove are implemented in terms of a keying function: any callable that takes an object and returns an object for use as a dictionary key.

  • __init__(keyfunc)
  • Create a new collection with keying provided by keyfunc.

keyfunc may be any callable that takes an object and returns an object for use as a dictionary key.

The keyfunc will be called every time the ORM needs to add a member by value-only (such as when loading instances from the database) or remove a member. The usual cautions about dictionary keying apply - keyfunc(object) should return the same output for the life of the collection. Keying based on mutable properties can result in unreachable instances “lost” in the collection.

  • clear() → None. Remove all items from D.
  • pop(k[, d]) → v, remove specified key and return the corresponding value.
  • If key is not found, d is returned if given, otherwise KeyError is raised

  • popitem() → (k, v), remove and return some (key, value) pair as a 2-tuple; but raise KeyError if D is empty.

  • remove(value, _sa_initiator=None)

  • Remove an item by value, consulting the keyfunc for the key.

  • set(value, _sa_initiator=None)

  • Add an item by value, consulting the keyfunc for the key.

  • setdefault(key, default=None)

  • Insert key with a value of default if key is not in the dictionary.

Return the value for key if key is in the dictionary, else default.

  • update([E, ]**F) → None. Update D from dict/iterable E and F.
  • If E is present and has a .keys() method, then does: for k in E: D[k] = E[k]. If E is present and lacks a .keys() method, then does: for k, v in E: D[k] = v. In either case, this is followed by: for k in F: D[k] = F[k].

Instrumentation and Custom Types

Many custom types and existing library classes can be used as an entity collection type as-is without further ado. However, it is important to note that the instrumentation process will modify the type, adding decorators around methods automatically.

The decorations are lightweight and no-op outside of relationships, but they do add unneeded overhead when triggered elsewhere. When using a library class as a collection, it can be good practice to use the “trivial subclass” trick to restrict the decorations to just your usage in relationships. For example:

  class MyAwesomeList(some.great.library.AwesomeList):
      pass

  # ... relationship(..., collection_class=MyAwesomeList)

The ORM uses this approach for built-ins, quietly substituting a trivialsubclass when a list, set or dict is used directly.

Collection Internals

Various internal methods.

  • sqlalchemy.orm.collections.bulk_replace(values, existing_adapter, new_adapter, initiator=None)
  • Load a new collection, firing events based on prior like membership.

Appends instances in values onto the new_adapter. Events will be fired for any instance not present in the existing_adapter. Any instances in existing_adapter not present in values will have remove events fired upon them.

  • Parameters
    • values – An iterable of collection member instances

    • existing_adapter – A CollectionAdapter of instances to be replaced

    • new_adapter – An empty CollectionAdapter to load with values

  • class sqlalchemy.orm.collections.collection
  • Decorators for entity collection classes.

The decorators fall into two groups: annotations and interception recipes.

  • sqlalchemy.orm.collections.collection_adapter = operator.attrgetter('_sa_adapter')
  • Fetch the CollectionAdapter for a collection.

  • class sqlalchemy.orm.collections.CollectionAdapter(attr, owner_state, data)

  • Bridges between the ORM and arbitrary Python collections.

Proxies base-level collection operations (append, remove, iterate) to the underlying Python collection, and emits add/remove events for entities entering or leaving the collection.

The ORM uses CollectionAdapter exclusively for interaction withentity collections.

  • class sqlalchemy.orm.collections.InstrumentedDict
  • Bases: builtins.dict

An instrumented version of the built-in dict.

  • class sqlalchemy.orm.collections.InstrumentedList
  • Bases: builtins.list

An instrumented version of the built-in list.

  • class sqlalchemy.orm.collections.InstrumentedSet
  • Bases: builtins.set

An instrumented version of the built-in set.

  • sqlalchemy.orm.collections.prepare_instrumentation(factory)
  • Prepare a callable for future use as a collection class factory.

Given a collection class factory (either a type or no-arg callable), return another factory that will produce compatible instances when called.

This function is responsible for converting collection_class=list into the run-time behavior of collection_class=InstrumentedList.