Collection Configuration and Techniques
The relationship() function defines a linkage between two classes. When the linkage defines a one-to-many or many-to-many relationship, it's represented as a Python collection when objects are loaded and manipulated. This section presents additional information about collection configuration and techniques.
Working with Large Collections
The default behavior of relationship() is to fully load the collection of items in, according to the loading strategy of the relationship. Additionally, the Session by default only knows how to delete objects which are actually present within the session. When a parent instance is marked for deletion and flushed, the Session loads its full list of child items in so that they may either be deleted as well, or have their foreign key value set to null; this is to avoid constraint violations. For large collections of child items, there are several strategies to bypass full loading of child items both at load time as well as deletion time.
Dynamic Relationship Loaders
A key feature to enable management of a large collection is the so-called "dynamic" relationship. This is an optional form of relationship() which returns a Query object in place of a collection when accessed. filter() criteria may be applied, as well as limits and offsets, either explicitly or via array slices:
```python
class User(Base):
    __tablename__ = 'user'

    posts = relationship(Post, lazy="dynamic")

jack = session.query(User).get(id)

# filter Jack's blog posts
posts = jack.posts.filter(Post.headline == 'this is a post')

# apply array slices
posts = jack.posts[5:20]
```
The dynamic relationship supports limited write operations, via the append() and remove() methods:
```python
oldpost = jack.posts.filter(Post.headline == 'old post').one()
jack.posts.remove(oldpost)

jack.posts.append(Post('new post'))
```
Since the read side of the dynamic relationship always queries the database, changes to the underlying collection will not be visible until the data has been flushed. However, as long as "autoflush" is enabled on the Session in use, this will occur automatically each time the collection is about to emit a query.
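The autoflush interaction can be seen end to end in a small sketch. The User/Post models and the in-memory SQLite database here are illustrative assumptions, not part of the original example:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship, sessionmaker

Base = declarative_base()

class User(Base):
    __tablename__ = 'user'
    id = Column(Integer, primary_key=True)
    posts = relationship("Post", lazy="dynamic")

class Post(Base):
    __tablename__ = 'post'
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey('user.id'))
    headline = Column(String)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()  # autoflush=True is the default

jack = User()
session.add(jack)
session.commit()

# the append is pending; nothing has been written to the database yet
jack.posts.append(Post(headline="new post"))

# the dynamic collection emits a SELECT here; autoflush writes the
# pending Post first, so the query sees it immediately
count = jack.posts.filter(Post.headline == "new post").count()
print(count)
```

With autoflush disabled on the Session, the same count() would return zero until an explicit flush.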
To place a dynamic relationship on a backref, use the backref() function in conjunction with lazy='dynamic':
```python
class Post(Base):
    __table__ = posts_table

    user = relationship(User,
                backref=backref('posts', lazy='dynamic')
            )
```
Note that eager/lazy loading options cannot be used in conjunction with dynamic relationships at this time.
Note
The dynamic_loader() function is essentially the same as relationship() with the lazy='dynamic' argument specified.
Warning
The "dynamic" loader applies to collections only. It is not valid to use "dynamic" loaders with many-to-one, one-to-one, or uselist=False relationships. Newer versions of SQLAlchemy emit warnings or exceptions in these cases.
Setting Noload, RaiseLoad
A "noload" relationship never loads from the database, even when accessed. It is configured using lazy='noload':
```python
class MyClass(Base):
    __tablename__ = 'some_table'

    children = relationship(MyOtherClass, lazy='noload')
```
Above, the children collection is fully writeable, and changes to it will be persisted to the database as well as locally available for reading at the time they are added. However, when instances of MyClass are freshly loaded from the database, the children collection stays empty. The noload strategy is also available on a query option basis using the orm.noload() loader option.
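As a sketch of the query-option form, noload() can switch off loading for a single query even though the relationship itself uses a normal lazy strategy. The Parent/Child models and the in-memory database are assumptions for illustration:

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import noload, relationship, sessionmaker

Base = declarative_base()

class Parent(Base):
    __tablename__ = 'parent'
    id = Column(Integer, primary_key=True)
    children = relationship("Child")

class Child(Base):
    __tablename__ = 'child'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey('parent.id'))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

session.add(Parent(children=[Child(), Child()]))
session.commit()
session.expunge_all()  # force a fresh load on the next query

# child rows exist, but this query's collection stays empty
p = session.query(Parent).options(noload(Parent.children)).one()
print(len(p.children))
```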
Alternatively, a "raise"-loaded relationship will raise an InvalidRequestError where the attribute would normally emit a lazy load:
```python
class MyClass(Base):
    __tablename__ = 'some_table'

    children = relationship(MyOtherClass, lazy='raise')
```
Above, attribute access on the children collection will raise an exception if it was not previously eager loaded. This includes read access, but for collections it will also affect write access, as collections can't be mutated without first loading them. The rationale for this is to ensure that an application is not emitting any unexpected lazy loads within a certain context. Rather than having to read through SQL logs to determine that all necessary attributes were eager loaded, the "raise" strategy will cause unloaded attributes to raise immediately if accessed. The raise strategy is also available on a query option basis using the orm.raiseload() loader option.
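A runnable sketch of the query-option form, raiseload(), showing the exception firing on attribute access (the Parent/Child models and in-memory database are assumptions for illustration):

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine
from sqlalchemy.exc import InvalidRequestError
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import raiseload, relationship, sessionmaker

Base = declarative_base()

class Parent(Base):
    __tablename__ = 'parent'
    id = Column(Integer, primary_key=True)
    children = relationship("Child")

class Child(Base):
    __tablename__ = 'child'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey('parent.id'))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add(Parent(children=[Child()]))
session.commit()
session.expunge_all()

p = session.query(Parent).options(raiseload(Parent.children)).one()
try:
    p.children  # would normally emit a lazy load SELECT
    raised = False
except InvalidRequestError:
    raised = True
print(raised)
```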
New in version 1.1: added the “raise” loader strategy.
See also
Preventing unwanted lazy loads using raiseload
Using Passive Deletes
Use passive_deletes to disable child object loading on a DELETE operation, in conjunction with "ON DELETE (CASCADE|SET NULL)" on your database to automatically cascade deletes to child objects:
```python
class MyClass(Base):
    __tablename__ = 'mytable'
    id = Column(Integer, primary_key=True)
    children = relationship("MyOtherClass",
                    cascade="all, delete-orphan",
                    passive_deletes=True)

class MyOtherClass(Base):
    __tablename__ = 'myothertable'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer,
                ForeignKey('mytable.id', ondelete='CASCADE')
                    )
```
Note
To use "ON DELETE CASCADE", the underlying database engine must support foreign keys.

When using MySQL, an appropriate storage engine must be selected. See CREATE TABLE arguments including Storage Engines for details.

When using SQLite, foreign key support must be enabled explicitly. See Foreign Key Support for details.
When passive_deletes is applied, the children relationship will not be loaded into memory when an instance of MyClass is marked for deletion. The cascade="all, delete-orphan" will take effect for instances of MyOtherClass which are currently present in the session; however for instances of MyOtherClass which are not loaded, SQLAlchemy assumes that "ON DELETE CASCADE" rules will ensure that those rows are deleted by the database.
See also
orm.mapper.passive_deletes - similar feature on mapper()
Customizing Collection Access
Mapping a one-to-many or many-to-many relationship results in a collection of values accessible through an attribute on the parent instance. By default, this collection is a list:
```python
class Parent(Base):
    __tablename__ = 'parent'

    parent_id = Column(Integer, primary_key=True)
    children = relationship(Child)

parent = Parent()
parent.children.append(Child())
print(parent.children[0])
```
Collections are not limited to lists. Sets, mutable sequences and almost any other Python object that can act as a container can be used in place of the default list, by specifying the collection_class option on relationship():
```python
class Parent(Base):
    __tablename__ = 'parent'

    parent_id = Column(Integer, primary_key=True)

    # use a set
    children = relationship(Child, collection_class=set)

parent = Parent()
child = Child()
parent.children.add(child)
assert child in parent.children
```
Dictionary Collections
A little extra detail is needed when using a dictionary as a collection. This is because objects are always loaded from the database as lists, and a key-generation strategy must be available to populate the dictionary correctly. The attribute_mapped_collection() function is by far the most common way to achieve a simple dictionary collection. It produces a dictionary class that will apply a particular attribute of the mapped class as a key. Below we map an Item class containing a dictionary of Note items keyed to the Note.keyword attribute:
```python
from sqlalchemy import Column, Integer, String, ForeignKey
from sqlalchemy.orm import relationship
from sqlalchemy.orm.collections import attribute_mapped_collection
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Item(Base):
    __tablename__ = 'item'
    id = Column(Integer, primary_key=True)
    notes = relationship("Note",
                collection_class=attribute_mapped_collection('keyword'),
                cascade="all, delete-orphan")

class Note(Base):
    __tablename__ = 'note'
    id = Column(Integer, primary_key=True)
    item_id = Column(Integer, ForeignKey('item.id'), nullable=False)
    keyword = Column(String)
    text = Column(String)

    def __init__(self, keyword, text):
        self.keyword = keyword
        self.text = text
```
Item.notes is then a dictionary:
```python
>>> item = Item()
>>> item.notes['a'] = Note('a', 'atext')
>>> item.notes.items()
{'a': <__main__.Note object at 0x2eaaf0>}
```
attribute_mapped_collection() will ensure that the .keyword attribute of each Note complies with the key in the dictionary. For example, when assigning to Item.notes, the dictionary key we supply must match that of the actual Note object:
```python
item = Item()
item.notes = {
    'a': Note('a', 'atext'),
    'b': Note('b', 'btext')
}
```
The attribute which attribute_mapped_collection() uses as a key does not need to be mapped at all! Using a regular Python @property allows virtually any detail or combination of details about the object to be used as the key, as below when we establish it as a tuple of Note.keyword and the first ten letters of the Note.text field:
```python
class Item(Base):
    __tablename__ = 'item'
    id = Column(Integer, primary_key=True)
    notes = relationship("Note",
                collection_class=attribute_mapped_collection('note_key'),
                backref="item",
                cascade="all, delete-orphan")

class Note(Base):
    __tablename__ = 'note'
    id = Column(Integer, primary_key=True)
    item_id = Column(Integer, ForeignKey('item.id'), nullable=False)
    keyword = Column(String)
    text = Column(String)

    @property
    def note_key(self):
        return (self.keyword, self.text[0:10])

    def __init__(self, keyword, text):
        self.keyword = keyword
        self.text = text
```
Above we added a Note.item backref. Assigning to this reverse relationship, the Note is added to the Item.notes dictionary and the key is generated for us automatically:
```python
>>> item = Item()
>>> n1 = Note("a", "atext")
>>> n1.item = item
>>> item.notes
{('a', 'atext'): <__main__.Note object at 0x2eaaf0>}
```
Other built-in dictionary types include column_mapped_collection(), which is almost like attribute_mapped_collection() except given the Column object directly:
```python
from sqlalchemy.orm.collections import column_mapped_collection

class Item(Base):
    __tablename__ = 'item'
    id = Column(Integer, primary_key=True)
    notes = relationship("Note",
                collection_class=column_mapped_collection(Note.__table__.c.keyword),
                cascade="all, delete-orphan")
```
as well as mapped_collection() which is passed any callable function. Note that it's usually easier to use attribute_mapped_collection() along with a @property as mentioned earlier:
```python
from sqlalchemy.orm.collections import mapped_collection

class Item(Base):
    __tablename__ = 'item'
    id = Column(Integer, primary_key=True)
    notes = relationship("Note",
                collection_class=mapped_collection(lambda note: note.text[0:10]),
                cascade="all, delete-orphan")
```
Dictionary mappings are often combined with the "Association Proxy" extension to produce streamlined dictionary views. See Proxying to Dictionary Based Collections and Composite Association Proxies for examples.
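As a small sketch of that combination, an association proxy can present the Item/Note mapping from above as a plain dict of keyword to text. The `note_texts` attribute name and the explicit `creator` callable are illustrative assumptions, not part of the original example:

```python
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.ext.associationproxy import association_proxy
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship
from sqlalchemy.orm.collections import attribute_mapped_collection

Base = declarative_base()

class Item(Base):
    __tablename__ = 'item'
    id = Column(Integer, primary_key=True)
    notes = relationship("Note",
                         collection_class=attribute_mapped_collection('keyword'),
                         cascade="all, delete-orphan")

    # dict view of keyword -> Note.text; the creator builds a Note
    # from each key/value pair assigned through the proxy
    note_texts = association_proxy(
        'notes', 'text',
        creator=lambda keyword, text: Note(keyword, text))

class Note(Base):
    __tablename__ = 'note'
    id = Column(Integer, primary_key=True)
    item_id = Column(Integer, ForeignKey('item.id'), nullable=False)
    keyword = Column(String)
    text = Column(String)

    def __init__(self, keyword, text):
        self.keyword = keyword
        self.text = text

item = Item()
item.note_texts['a'] = 'atext'    # creates Note('a', 'atext') under the hood
print(item.note_texts['a'])
print(item.notes['a'].keyword)
```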
sqlalchemy.orm.collections.attribute_mapped_collection(attr_name)

A dictionary-based collection type with attribute-based keying.

Returns a MappedCollection factory with a keying based on the 'attr_name' attribute of entities in the collection, where attr_name is the string name of the attribute.

The key value must be immutable for the lifetime of the object. You can not, for example, map on foreign key values if those key values will change during the session, i.e. from None to a database-assigned integer after a session flush.
sqlalchemy.orm.collections.column_mapped_collection(mapping_spec)

A dictionary-based collection type with column-based keying.

Returns a MappedCollection factory with a keying function generated from mapping_spec, which may be a Column or a sequence of Columns.

The key value must be immutable for the lifetime of the object. You can not, for example, map on foreign key values if those key values will change during the session, i.e. from None to a database-assigned integer after a session flush.
sqlalchemy.orm.collections.mapped_collection(keyfunc)

A dictionary-based collection type with arbitrary keying.

Returns a MappedCollection factory with a keying function generated from keyfunc, a callable that takes an entity and returns a key value.

The key value must be immutable for the lifetime of the object. You can not, for example, map on foreign key values if those key values will change during the session, i.e. from None to a database-assigned integer after a session flush.
Custom Collection Implementations
You can use your own types for collections as well. In simple cases, inheriting from list or set, adding custom behavior, is all that's needed. In other cases, special decorators are needed to tell SQLAlchemy more detail about how the collection operates.
Do I need a custom collection implementation?
In most cases, not at all! The most common use case for a "custom" collection is one that validates or marshals incoming values into a new form, such as a string that becomes a class instance, or one which goes a step beyond and represents the data internally in some fashion, presenting a "view" of that data on the outside of a different form.
For the first use case, the orm.validates() decorator is by far the simplest way to intercept incoming values in all cases for the purposes of validation and simple marshaling. See Simple Validators for an example of this.
For the second use case, the Association Proxy extension is a well-tested, widely used system that provides a read/write "view" of a collection in terms of some attribute present on the target object. As the target attribute can be a @property that returns virtually anything, a wide array of "alternative" views of a collection can be constructed with just a few functions. This approach leaves the underlying mapped collection unaffected and avoids the need to carefully tailor collection behavior on a method-by-method basis.
Customized collections are useful when the collection needs to have special behaviors upon access or mutation operations that can't otherwise be modeled externally to the collection. They can of course be combined with the above two approaches.
Collections in SQLAlchemy are transparently instrumented. Instrumentation means that normal operations on the collection are tracked and result in changes being written to the database at flush time. Additionally, collection operations can fire events which indicate some secondary operation must take place. Examples of a secondary operation include saving the child item in the parent's Session (i.e. the save-update cascade), as well as synchronizing the state of a bi-directional relationship (i.e. a backref()).
The collections package understands the basic interface of lists, sets and dicts and will automatically apply instrumentation to those built-in types and their subclasses. Object-derived types that implement a basic collection interface are detected and instrumented via duck-typing:
```python
class ListLike(object):
    def __init__(self):
        self.data = []
    def append(self, item):
        self.data.append(item)
    def remove(self, item):
        self.data.remove(item)
    def extend(self, items):
        self.data.extend(items)
    def __iter__(self):
        return iter(self.data)
    def foo(self):
        return 'foo'
```
append, remove, and extend are known list-like methods, and will be instrumented automatically. __iter__ is not a mutator method and won't be instrumented, and foo won't be either.
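To see the duck-typed instrumentation in action, a class like ListLike can be handed to relationship() as a collection_class. The Parent/Child models and in-memory database below are illustrative assumptions; ListLike is repeated here (minus the extras) so the sketch is self-contained:

```python
from sqlalchemy import Column, ForeignKey, Integer, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship, sessionmaker

Base = declarative_base()

class ListLike(object):
    def __init__(self):
        self.data = []
    def append(self, item):          # detected as the appender by duck-typing
        self.data.append(item)
    def remove(self, item):
        self.data.remove(item)
    def __iter__(self):
        return iter(self.data)

class Parent(Base):
    __tablename__ = 'parent'
    id = Column(Integer, primary_key=True)
    children = relationship("Child", collection_class=ListLike)

class Child(Base):
    __tablename__ = 'child'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey('parent.id'))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

p = Parent()
p.children.append(Child())   # the instrumented append fires ORM events
session.add(p)
session.commit()
session.expunge_all()

loaded = session.query(Parent).one()
print(isinstance(loaded.children, ListLike))
print(len(list(loaded.children)))
```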
Duck-typing (i.e. guesswork) isn't rock-solid, of course, so you can be explicit about the interface you are implementing by providing an __emulates__ class attribute:
```python
class SetLike(object):
    __emulates__ = set

    def __init__(self):
        self.data = set()
    def append(self, item):
        self.data.add(item)
    def remove(self, item):
        self.data.remove(item)
    def __iter__(self):
        return iter(self.data)
```
This class looks list-like because of append, but __emulates__ forces it to set-like. remove is known to be part of the set interface and will be instrumented.
But this class won't work quite yet: a little glue is needed to adapt it for use by SQLAlchemy. The ORM needs to know which methods to use to append, remove and iterate over members of the collection. When using a type like list or set, the appropriate methods are well-known and used automatically when present. This set-like class does not provide the expected add method, so we must supply an explicit mapping for the ORM via a decorator.
Annotating Custom Collections via Decorators
Decorators can be used to tag the individual methods the ORM needs to manage collections. Use them when your class doesn't quite meet the regular interface for its container type, or when you otherwise would like to use a different method to get the job done.
```python
from sqlalchemy.orm.collections import collection

class SetLike(object):
    __emulates__ = set

    def __init__(self):
        self.data = set()

    @collection.appender
    def append(self, item):
        self.data.add(item)

    def remove(self, item):
        self.data.remove(item)

    def __iter__(self):
        return iter(self.data)
```
And that's all that's needed to complete the example. SQLAlchemy will add instances via the append method. remove and __iter__ are the default methods for sets and will be used for removing and iteration. Default methods can be changed as well:
```python
from sqlalchemy.orm.collections import collection

class MyList(list):
    @collection.remover
    def zark(self, item):
        # do something special...
        ...

    @collection.iterator
    def hey_use_this_instead_for_iteration(self):
        # ...
        ...
```
There is no requirement to be list-, or set-like at all. Collection classes can be any shape, so long as they have the append, remove and iterate interface marked for SQLAlchemy's use. Append and remove methods will be called with a mapped entity as the single argument, and iterator methods are called with no arguments and must return an iterator.
The decorators fall into two groups: annotations and interception recipes.
The annotating decorators (appender, remover, iterator, linker, converter, internally_instrumented) indicate the method's purpose and take no arguments. They are not written with parens:
```python
@collection.appender
def append(self, append): ...
```
The recipe decorators all require parens, even those that take no arguments:
```python
@collection.adds('entity')
def insert(self, position, entity): ...

@collection.removes_return()
def popitem(self): ...
```
collection.adds(arg)

Adds "add to collection" handling to the method. The decorator argument indicates which method argument holds the SQLAlchemy-relevant value. Arguments can be specified positionally (i.e. integer) or by name:
```python
@collection.adds(1)
def push(self, item): ...

@collection.adds('entity')
def do_stuff(self, thing, entity=None): ...
```
collection.appender(fn)

The appender method is called with one positional argument: the value to append. The method will be automatically decorated with 'adds(1)' if not already decorated:
```python
@collection.appender
def add(self, append): ...

# or, equivalently
@collection.appender
@collection.adds(1)
def add(self, append): ...

# for mapping type, an 'append' may kick out a previous value
# that occupies that slot.  consider d['a'] = 'foo'- any previous
# value in d['a'] is discarded.
@collection.appender
@collection.replaces(1)
def add(self, entity):
    key = some_key_func(entity)
    previous = None
    if key in self:
        previous = self[key]
    self[key] = entity
    return previous
```
If the value to append is not allowed in the collection, you may raise an exception. Something to remember is that the appender will be called for each object mapped by a database query. If the database contains rows that violate your collection semantics, you will need to get creative to fix the problem, as access via the collection will not work.
If the appender method is internally instrumented, you must also receive the keyword argument '_sa_initiator' and ensure its promulgation to collection events.
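As a sketch of a rejecting append, an exception raised in the collection's append keeps the value out of the collection. The KeywordList/Parent/Keyword names are illustrative assumptions; note that the ORM's add event may fire before the underlying method runs, so for production validation the orm.validates() decorator discussed earlier is usually the better fit:

```python
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship

Base = declarative_base()

class KeywordList(list):
    def append(self, item):
        # reject members that violate our collection semantics
        if not item.word:
            raise ValueError("Keyword.word must be non-empty")
        super(KeywordList, self).append(item)

class Parent(Base):
    __tablename__ = 'parent'
    id = Column(Integer, primary_key=True)
    keywords = relationship("Keyword", collection_class=KeywordList)

class Keyword(Base):
    __tablename__ = 'keyword'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey('parent.id'))
    word = Column(String)

p = Parent()
p.keywords.append(Keyword(word="ok"))
try:
    p.keywords.append(Keyword(word=""))
    rejected = False
except ValueError:
    rejected = True
print(rejected, len(p.keywords))
```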
collection.converter(fn)

Deprecated since version 1.3: The collection.converter() handler is deprecated and will be removed in a future release. Please refer to the AttributeEvents.bulk_replace listener interface in conjunction with the event.listen() function.
This optional method will be called when a collection is being replaced entirely, as in:
```python
myobj.acollection = [newvalue1, newvalue2]
```
The converter method will receive the object being assigned and should return an iterable of values suitable for use by the appender method. A converter must not assign values or mutate the collection; its sole job is to adapt the value the user provides into an iterable of values for the ORM's use.

The default converter implementation will use duck-typing to do the conversion. A dict-like collection will be converted into an iterable of dictionary values, and other types will simply be iterated:
```python
@collection.converter
def convert(self, other): ...
```
If the duck-typing of the object does not match the type of this collection, a TypeError is raised.

Supply an implementation of this method if you want to expand the range of possible types that can be assigned in bulk or perform validation on the values about to be assigned.
collection.internally_instrumented(fn)

This tag will prevent any decoration from being applied to the method. Use this if you are orchestrating your own calls to collection_adapter() in one of the basic SQLAlchemy interface methods, or to prevent an automatic ABC method decoration from wrapping your implementation:
```python
# normally an 'extend' method on a list-like class would be
# automatically intercepted and re-implemented in terms of
# SQLAlchemy events and append().  your implementation will
# never be called, unless:
@collection.internally_instrumented
def extend(self, items): ...
```
collection.iterator(fn)

The iterator method is called with no arguments. It is expected to return an iterator over all collection members:
```python
@collection.iterator
def __iter__(self): ...
```
collection.link(fn)

Synonym for collection.linker().

Deprecated since version 1.0: collection.link() is deprecated and will be removed in a future release.
collection.linker(fn)

Deprecated since version 1.0: The collection.linker() handler is deprecated and will be removed in a future release. Please refer to the AttributeEvents.init_collection() and AttributeEvents.dispose_collection() event handlers.
This optional event handler will be called when the collection class is linked to or unlinked from the InstrumentedAttribute. It is invoked immediately after the '_sa_adapter' property is set on the instance. A single argument is passed: the collection adapter that has been linked, or None if unlinking.
collection.remover(fn)

The remover method is called with one positional argument: the value to remove. The method will be automatically decorated with removes_return() if not already decorated.
If the value to remove is not present in the collection, you may raise an exception or return None to ignore the error.

If the remove method is internally instrumented, you must also receive the keyword argument '_sa_initiator' and ensure its promulgation to collection events.
collection.removes(arg)

Adds "remove from collection" handling to the method. The decorator argument indicates which method argument holds the SQLAlchemy-relevant value to be removed. Arguments can be specified positionally (i.e. integer) or by name:
```python
@collection.removes(1)
def zap(self, item): ...
```
For methods where the value to remove is not known at call-time, use collection.removes_return.
collection.removes_return()

Adds "remove from collection" handling to the method. The return value of the method, if any, is considered the value to remove. The method arguments are not inspected:
```python
@collection.removes_return()
def pop(self): ...
```
For methods where the value to remove is known at call-time, use collection.removes.
collection.replaces(arg)

Adds "add to collection" and "remove from collection" handling to the method. The decorator argument indicates which method argument holds the SQLAlchemy-relevant value to be added, and the return value, if any, will be considered the value to remove.

Arguments can be specified positionally (i.e. integer) or by name:
```python
@collection.replaces(2)
def __setitem__(self, index, item): ...
```
Custom Dictionary-Based Collections
The MappedCollection class can be used as a base class for your custom types or as a mix-in to quickly add dict collection support to other classes. It uses a keying function to delegate to __setitem__ and __delitem__:
```python
from sqlalchemy.util import OrderedDict
from sqlalchemy.orm.collections import MappedCollection

class NodeMap(OrderedDict, MappedCollection):
    """Holds 'Node' objects, keyed by the 'name' attribute with insert order maintained."""

    def __init__(self, *args, **kw):
        MappedCollection.__init__(self, keyfunc=lambda node: node.name)
        OrderedDict.__init__(self, *args, **kw)
```
When subclassing MappedCollection, user-defined versions of __setitem__() or __delitem__() should be decorated with collection.internally_instrumented(), if they call down to those same methods on MappedCollection. This is because the methods on MappedCollection are already instrumented - calling them from within an already instrumented call can cause events to be fired off repeatedly, or inappropriately, leading to internal state corruption in rare cases:
```python
from sqlalchemy.orm.collections import MappedCollection,\
    collection

class MyMappedCollection(MappedCollection):
    """Use @internally_instrumented when your methods
    call down to already-instrumented methods.
    """

    @collection.internally_instrumented
    def __setitem__(self, key, value, _sa_initiator=None):
        # do something with key, value
        super(MyMappedCollection, self).__setitem__(key, value, _sa_initiator)

    @collection.internally_instrumented
    def __delitem__(self, key, _sa_initiator=None):
        # do something with key
        super(MyMappedCollection, self).__delitem__(key, _sa_initiator)
```
The ORM understands the dict interface just like lists and sets, and will automatically instrument all dict-like methods if you choose to subclass dict or provide dict-like collection behavior in a duck-typed class. You must decorate appender and remover methods, however - there are no compatible methods in the basic dictionary interface for SQLAlchemy to use by default. Iteration will go through itervalues() unless otherwise decorated.
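A minimal sketch of such a dict subclass with the required appender and remover decorations; the NoteDict/Item/Note names and the in-memory database are illustrative assumptions:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship, sessionmaker
from sqlalchemy.orm.collections import collection

Base = declarative_base()

class NoteDict(dict):
    """A dict keyed by Note.keyword; the decorated methods tell the
    ORM how to add and remove members by value."""

    @collection.appender
    def _append(self, note):
        self[note.keyword] = note

    @collection.remover
    def _remove(self, note):
        del self[note.keyword]

class Item(Base):
    __tablename__ = 'item'
    id = Column(Integer, primary_key=True)
    notes = relationship("Note", collection_class=NoteDict,
                         cascade="all, delete-orphan")

class Note(Base):
    __tablename__ = 'note'
    id = Column(Integer, primary_key=True)
    item_id = Column(Integer, ForeignKey('item.id'), nullable=False)
    keyword = Column(String)
    text = Column(String)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

item = Item()
item.notes['a'] = Note(keyword='a', text='atext')
session.add(item)
session.commit()
session.expunge_all()

# on load, the ORM repopulates the dict through the decorated appender
loaded = session.query(Item).one()
print(sorted(loaded.notes))
print(loaded.notes['a'].text)
```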
Note
Due to a bug in MappedCollection prior to version 0.7.6, this workaround usually needs to be called before a custom subclass of MappedCollection which uses collection.internally_instrumented() can be used:

```python
from sqlalchemy.orm.collections import _instrument_class, MappedCollection

_instrument_class(MappedCollection)
```
This will ensure that the MappedCollection has been properly initialized with custom __setitem__() and __delitem__() methods before used in a custom subclass.
class sqlalchemy.orm.collections.MappedCollection(keyfunc)

A basic dictionary-based collection class.

Extends dict with the minimal bag semantics that collection classes require. set and remove are implemented in terms of a keying function: any callable that takes an object and returns an object for use as a dictionary key.

keyfunc may be any callable that takes an object and returns an object for use as a dictionary key.

The keyfunc will be called every time the ORM needs to add a member by value-only (such as when loading instances from the database) or remove a member. The usual cautions about dictionary keying apply - keyfunc(object) should return the same output for the life of the collection. Keying based on mutable properties can result in unreachable instances "lost" in the collection.
clear() → None.  Remove all items from D.

pop(k[, d]) → v.  Remove specified key and return the corresponding value. If key is not found, d is returned if given, otherwise KeyError is raised.

popitem() → (k, v).  Remove and return some (key, value) pair as a 2-tuple; raise KeyError if D is empty.

remove(value, _sa_initiator=None) - Remove an item by value, consulting the keyfunc for the key.

set(value, _sa_initiator=None) - Add an item by value, consulting the keyfunc for the key.

setdefault(key, default=None) - Insert key with a value of default if key is not in the dictionary. Return the value for key if key is in the dictionary, else default.

update([E, ]**F) → None.  Update D from dict/iterable E and F. If E is present and has a .keys() method, then does: for k in E: D[k] = E[k]. If E is present and lacks a .keys() method, then does: for k, v in E: D[k] = v. In either case, this is followed by: for k in F: D[k] = F[k].
Instrumentation and Custom Types
Many custom types and existing library classes can be used as an entity collection type as-is without further ado. However, it is important to note that the instrumentation process will modify the type, adding decorators around methods automatically.
The decorations are lightweight and no-op outside of relationships, but they do add unneeded overhead when triggered elsewhere. When using a library class as a collection, it can be good practice to use the "trivial subclass" trick to restrict the decorations to just your usage in relationships. For example:
```python
class MyAwesomeList(some.great.library.AwesomeList):
    pass

# ... relationship(..., collection_class=MyAwesomeList)
```
The ORM uses this approach for built-ins, quietly substituting a trivial subclass when a list, set or dict is used directly.
Collection Internals
Various internal methods.
sqlalchemy.orm.collections.bulk_replace(values, existing_adapter, new_adapter, initiator=None)

Load a new collection, firing events based on prior like membership.

Appends instances in values onto the new_adapter. Events will be fired for any instance not present in the existing_adapter. Any instances in existing_adapter not present in values will have remove events fired upon them.
Parameters:

- existing_adapter – A CollectionAdapter of instances to be replaced
- new_adapter – An empty CollectionAdapter to load with values
class sqlalchemy.orm.collections.collection

Decorators for entity collection classes.
The decorators fall into two groups: annotations and interception recipes.
The annotating decorators (appender, remover, iterator, linker, converter, internally_instrumented) indicate the method's purpose and take no arguments. They are not written with parens:
```python
@collection.appender
def append(self, append): ...
```
The recipe decorators all require parens, even those that take no arguments:
```python
@collection.adds('entity')
def insert(self, position, entity): ...

@collection.removes_return()
def popitem(self): ...
```
sqlalchemy.orm.collections.collection_adapter = operator.attrgetter('_sa_adapter')

Fetch the CollectionAdapter for a collection.

class sqlalchemy.orm.collections.CollectionAdapter(attr, owner_state, data)

Bridges between the ORM and arbitrary Python collections.
Proxies base-level collection operations (append, remove, iterate) to the underlying Python collection, and emits add/remove events for entities entering or leaving the collection.

The ORM uses CollectionAdapter exclusively for interaction with entity collections.
class sqlalchemy.orm.collections.InstrumentedDict - An instrumented version of the built-in dict.

class sqlalchemy.orm.collections.InstrumentedList - An instrumented version of the built-in list.

class sqlalchemy.orm.collections.InstrumentedSet - An instrumented version of the built-in set.
sqlalchemy.orm.collections.prepare_instrumentation(factory)

Prepare a callable for future use as a collection class factory.

Given a collection class factory (either a type or no-arg callable), return another factory that will produce compatible instances when called.

This function is responsible for converting collection_class=list into the run-time behavior of collection_class=InstrumentedList.