Release: 2.0.0b4 | Release Date: December 5, 2022

SQLAlchemy 2.0 Documentation

This page is part of the SQLAlchemy Unified Tutorial.


Working with Database Metadata¶

With engines and SQL execution down, we are ready to begin some Alchemy. The central element of both SQLAlchemy Core and ORM is the SQL Expression Language which allows for fluent, composable construction of SQL queries. The foundation for these queries are Python objects that represent database concepts like tables and columns. These objects are known collectively as database metadata.

The most common foundational objects for database metadata in SQLAlchemy are known as MetaData, Table, and Column. The sections below will illustrate how these objects are used in both a Core-oriented style as well as an ORM-oriented style.

ORM readers, stay with us!

As with other sections, Core users can skip the ORM sections, but ORM users would best be familiar with these objects from both perspectives.

Setting up MetaData with Table objects¶

When we work with a relational database, the basic structure that we create and query from is known as a table. In SQLAlchemy, the “table” is represented by a Python object similarly named Table.

To start using the SQLAlchemy Expression Language, we will want to have Table objects constructed that represent all of the database tables we are interested in working with. Each Table may be declared, meaning we explicitly spell out in source code what the table looks like, or may be reflected, which means we generate the object based on what’s already present in a particular database. The two approaches can also be blended in many ways.

Whether we will declare or reflect our tables, we start out with a collection into which we will place our tables, known as the MetaData object. This object is essentially a facade around a Python dictionary that stores a series of Table objects keyed to their string name. Constructing this object looks like:

>>> from sqlalchemy import MetaData
>>> metadata_obj = MetaData()

Having a single MetaData object for an entire application is the most common case, represented as a module-level variable in a single place in an application, often in a “models” or “dbschema” type of package. There can be multiple MetaData collections as well, however it’s typically most helpful if a series of Table objects that are related to each other belong to a single MetaData collection.

Once we have a MetaData object, we can declare some Table objects. This tutorial will start with the classic SQLAlchemy tutorial model, that of the table user, which would for example represent the users of a website, and the table address, representing a list of email addresses associated with rows in the user table. We normally assign each Table object to a variable that will be how we will refer to the table in application code:

>>> from sqlalchemy import Table, Column, Integer, String
>>> user_table = Table(
...     "user_account",
...     metadata_obj,
...     Column("id", Integer, primary_key=True),
...     Column("name", String(30)),
...     Column("fullname", String),
... )

We can observe that the above Table construct looks a lot like a SQL CREATE TABLE statement; starting with the table name, then listing out each column, where each column has a name and a datatype. The objects we use above are:

  • Table - represents a database table and assigns itself to a MetaData collection.

  • Column - represents a column in a database table, and assigns itself to a Table object. The Column usually includes a string name and a type object. The collection of Column objects in terms of the parent Table are typically accessed via an associative array located at Table.c:

    >>> user_table.c.name
    Column('name', String(length=30), table=<user_account>)
    
    >>> user_table.c.keys()
    ['id', 'name', 'fullname']
  • Integer, String - these classes represent SQL datatypes and can be passed to a Column either instantiated or not. Above, we want to give a length of “30” to the “name” column, so we instantiated String(30). But for “id” and “fullname” we did not need any arguments, so we passed the class itself.
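The dictionary-like behavior of MetaData described earlier can be observed directly. The short sketch below is self-contained (assuming SQLAlchemy is installed; it repeats the table declaration rather than extending the tutorial's running doctest):

```python
from sqlalchemy import Column, Integer, MetaData, String, Table

metadata_obj = MetaData()
user_table = Table(
    "user_account",
    metadata_obj,
    Column("id", Integer, primary_key=True),
    Column("name", String(30)),
    Column("fullname", String),
)

# The Table registered itself with the MetaData under its string name
assert "user_account" in metadata_obj.tables
assert metadata_obj.tables["user_account"] is user_table

# Columns are reachable through the associative Table.c collection
print(user_table.c.keys())            # ['id', 'name', 'fullname']
print(user_table.c.name.type.length)  # 30
```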

See also

The reference and API documentation for MetaData, Table and Column is at Describing Databases with MetaData. The reference documentation for datatypes is at SQL Datatype Objects.

In an upcoming section, we will illustrate one of the fundamental functions of Table which is to generate DDL on a particular database connection. But first we will declare a second Table.

Declaring Simple Constraints¶

The first Column in the above user_table includes the Column.primary_key parameter which is a shorthand technique of indicating that this Column should be part of the primary key for this table. The primary key itself is normally declared implicitly and is represented by the PrimaryKeyConstraint construct, which we can see on the Table.primary_key attribute on the Table object:

>>> user_table.primary_key
PrimaryKeyConstraint(Column('id', Integer(), table=<user_account>, primary_key=True, nullable=False))

The constraint that is most typically declared explicitly is the ForeignKeyConstraint object that corresponds to a database foreign key constraint. When we declare tables that are related to each other, SQLAlchemy uses the presence of these foreign key constraint declarations not only so that they are emitted within CREATE statements to the database, but also to assist in constructing SQL expressions.

A ForeignKeyConstraint that involves only a single column on the target table is typically declared using a column-level shorthand notation via the ForeignKey object. Below we declare a second table address that will have a foreign key constraint referring to the user table:

>>> from sqlalchemy import ForeignKey
>>> address_table = Table(
...     "address",
...     metadata_obj,
...     Column("id", Integer, primary_key=True),
...     Column("user_id", ForeignKey("user_account.id"), nullable=False),
...     Column("email_address", String, nullable=False),
... )

The table above also features a third kind of constraint, which in SQL is the “NOT NULL” constraint, indicated above using the Column.nullable parameter.

Tip

When using the ForeignKey object within a Column definition, we can omit the datatype for that Column; it is automatically inferred from that of the related column, in the above example the Integer datatype of the user_account.id column.
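We can see the inferred datatype by compiling the CREATE TABLE statement for such a table. The sketch below is self-contained, re-declaring both tables rather than reusing the tutorial's running objects; the user_id column comes out as INTEGER even though no datatype was given:

```python
from sqlalchemy import Column, ForeignKey, Integer, MetaData, String, Table
from sqlalchemy.schema import CreateTable

metadata_obj = MetaData()
user_table = Table(
    "user_account",
    metadata_obj,
    Column("id", Integer, primary_key=True),
    Column("name", String(30)),
)
address_table = Table(
    "address",
    metadata_obj,
    Column("id", Integer, primary_key=True),
    # no datatype here - it is inferred from user_account.id
    Column("user_id", ForeignKey("user_account.id"), nullable=False),
    Column("email_address", String(100), nullable=False),
)

# Compile the CREATE TABLE against the default dialect; note the
# inferred INTEGER type on the user_id column
ddl = str(CreateTable(address_table))
print(ddl)
```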

In the next section we will emit the completed DDL for the user and address tables to see the final result.

Emitting DDL to the Database¶

We’ve constructed a fairly elaborate object hierarchy to represent two database tables, starting at the root MetaData object, then into two Table objects, each of which hold onto a collection of Column and Constraint objects. This object structure will be at the center of most operations we perform with both Core and ORM going forward.

The first useful thing we can do with this structure will be to emit CREATE TABLE statements, or DDL, to our SQLite database so that we can insert and query data from them. We already have all the tools needed to do so, by invoking the MetaData.create_all() method on our MetaData, sending it the Engine that refers to the target database:

>>> metadata_obj.create_all(engine)
BEGIN (implicit)
PRAGMA main.table_...info("user_account")
...
PRAGMA main.table_...info("address")
...
CREATE TABLE user_account (
    id INTEGER NOT NULL,
    name VARCHAR(30),
    fullname VARCHAR,
    PRIMARY KEY (id)
)
...
CREATE TABLE address (
    id INTEGER NOT NULL,
    user_id INTEGER NOT NULL,
    email_address VARCHAR NOT NULL,
    PRIMARY KEY (id),
    FOREIGN KEY(user_id) REFERENCES user_account (id)
)
...
COMMIT

The DDL create process by default includes some SQLite-specific PRAGMA statements that test for the existence of each table before emitting a CREATE. The full series of steps is also included within a BEGIN/COMMIT pair to accommodate transactional DDL (SQLite does actually support transactional DDL, however the sqlite3 database driver historically runs DDL in “autocommit” mode).

The create process also takes care of emitting CREATE statements in the correct order; above, the FOREIGN KEY constraint is dependent on the user table existing, so the address table is created second. In more complicated dependency scenarios the FOREIGN KEY constraints may also be applied to tables after the fact using ALTER.

The MetaData object also features a MetaData.drop_all() method that will emit DROP statements, in the reverse order of the CREATE statements, in order to drop schema elements.

Migration tools are usually appropriate

Overall, the CREATE / DROP feature of MetaData is useful for test suites, small and/or new applications, and applications that use short-lived databases. For management of an application database schema over the long term however, a schema management tool such as Alembic, which builds upon SQLAlchemy, is likely a better choice, as it can manage and orchestrate the process of incrementally altering a fixed database schema over time as the design of the application changes.
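The create / drop round trip can be exercised against a throwaway in-memory SQLite database. Below is a condensed, self-contained sketch (separate from the tutorial's running example, with the tables abbreviated):

```python
from sqlalchemy import (
    Column,
    Integer,
    MetaData,
    String,
    Table,
    create_engine,
    inspect,
)

metadata_obj = MetaData()
Table(
    "user_account",
    metadata_obj,
    Column("id", Integer, primary_key=True),
    Column("name", String(30)),
)
Table(
    "address",
    metadata_obj,
    Column("id", Integer, primary_key=True),
    Column("email_address", String(100)),
)

# A throwaway in-memory SQLite database for illustration
engine = create_engine("sqlite+pysqlite:///:memory:")

metadata_obj.create_all(engine)
print(sorted(inspect(engine).get_table_names()))  # ['address', 'user_account']

metadata_obj.drop_all(engine)
print(inspect(engine).get_table_names())  # []
```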

Defining Table Metadata with the ORM¶

This ORM-only section will provide an example declaring the same database structure illustrated in the previous section, using a more ORM-centric configuration paradigm. When using the ORM, the process by which we declare Table metadata is usually combined with the process of declaring mapped classes. The mapped class is any Python class we’d like to create, which will then have attributes on it that will be linked to the columns in a database table. While there are a few varieties of how this is achieved, the most common style is known as declarative, and allows us to declare our user-defined classes and Table metadata at once.

Establishing a Declarative Base¶

When using the ORM, the MetaData collection remains present; however, it is associated with an ORM-only construct commonly referred to as the Declarative Base. The most expedient way to acquire a new Declarative Base is to create a new class that subclasses the SQLAlchemy DeclarativeBase class:

>>> from sqlalchemy.orm import DeclarativeBase
>>> class Base(DeclarativeBase):
...     pass

Above, the Base class is what we’ll refer to as the Declarative Base. When we make new classes that are subclasses of Base, combined with appropriate class-level directives, they will each be established as part of an object relational mapping against a particular database table (or tables, in advanced usages).

The Declarative Base, when declared as a new class, refers to a MetaData collection that is created for us automatically (options exist to use our own MetaData object as well); this MetaData is accessible via the .metadata class-level attribute. As we create new mapped classes, they each will reference a Table within this MetaData collection:

>>> Base.metadata
MetaData()

The Declarative Base also refers to a collection called registry, which is the central “mapper configuration” unit in the SQLAlchemy ORM. While seldom accessed directly, this object is central to the mapper configuration process, as a set of ORM mapped classes will coordinate with each other via this registry. As was the case with MetaData, our Declarative Base also created a registry for us (again with options to pass our own registry), which we can access via the .registry class variable:

>>> Base.registry
<sqlalchemy.orm.decl_api.registry object at 0x...>

registry also provides other mapper configurational patterns, including different ways to acquire a Declarative Base object, as well as class decorators and class-processing functions which allow user-defined classes to be mapped without using any particular base class. Therefore, keep in mind that all the ORM patterns here that use “declarative base” can just as easily use other patterns based on class decorators or configurational functions.
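As a brief sketch of the decorator style just mentioned (the Widget class here is invented purely for illustration; the tutorial itself continues with the Declarative Base):

```python
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import registry

reg = registry()


# registry.mapped maps the class declaratively, with no common base class
@reg.mapped
class Widget:
    __tablename__ = "widget"

    id = Column(Integer, primary_key=True)
    name = Column(String(30))


# The decorator generated a Table inside the registry's own MetaData
print(Widget.__table__.name)            # widget
print("widget" in reg.metadata.tables)  # True
```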

Declaring Mapped Classes¶

With the Base class established, we can now define ORM mapped classes for the user_account and address tables in terms of new classes User and Address. We illustrate below the most modern form of Declarative, which is driven from PEP 484 type annotations using a special type Mapped, which indicates attributes to be mapped as particular types:

>>> from typing import List
>>> from typing import Optional
>>> from sqlalchemy.orm import Mapped
>>> from sqlalchemy.orm import mapped_column
>>> from sqlalchemy.orm import relationship

>>> class User(Base):
...     __tablename__ = "user_account"
...
...     id: Mapped[int] = mapped_column(primary_key=True)
...     name: Mapped[str] = mapped_column(String(30))
...     fullname: Mapped[Optional[str]]
...
...     addresses: Mapped[List["Address"]] = relationship(back_populates="user")
...
...     def __repr__(self) -> str:
...         return f"User(id={self.id!r}, name={self.name!r}, fullname={self.fullname!r})"

>>> class Address(Base):
...     __tablename__ = "address"
...
...     id: Mapped[int] = mapped_column(primary_key=True)
...     email_address: Mapped[str]
...     user_id = mapped_column(ForeignKey("user_account.id"))
...
...     user: Mapped[User] = relationship(back_populates="addresses")
...
...     def __repr__(self) -> str:
...         return f"Address(id={self.id!r}, email_address={self.email_address!r})"

The two classes above, User and Address, are now referred to as ORM Mapped Classes, and are available for use in ORM persistence and query operations, which will be described later. Details about these classes include:

  • Each class refers to a Table object that was generated as part of the declarative mapping process, and are equivalent in structure to the Table objects we constructed directly in the previous Core section. This Table is available from an attribute added to the class called __table__.

  • To indicate columns in the Table, we use the mapped_column() construct, in combination with typing annotations based on the Mapped type.

  • For columns with simple datatypes and no other options, we can indicate a Mapped type annotation alone, using simple Python types like int and str to mean Integer and String. Customization of how Python types are interpreted within the Declarative mapping process is very open ended; see the section Using Annotated Declarative Table (Type Annotated Forms for mapped_column()) for background.

  • A column can be declared as “nullable” or “not null” based on the presence of the Optional[] type annotation; alternatively, the mapped_column.nullable parameter may be used instead.

  • Usage of explicit typing annotations is completely optional. We can also use mapped_column() without annotations. When using this form, we would use more explicit type objects like Integer and String as well as nullable=False as needed within each mapped_column() construct.

  • Two additional attributes, User.addresses and Address.user, define a different kind of attribute called relationship(), which features similar annotation-aware configuration styles as shown. The relationship() construct is discussed more fully at Working with Related Objects.

  • The classes are automatically given an __init__() method if we don’t declare one of our own. The default form of this method accepts all attribute names as optional keyword arguments:

    >>> sandy = User(name="sandy", fullname="Sandy Cheeks")
  • The __repr__() methods are added so that we get a readable string output; there’s no requirement for these methods to be here.

Where’d the old Declarative go?

Users of SQLAlchemy 1.4 or previous will note that the above mapping uses a dramatically different form than before; not only does it use mapped_column() instead of Column in the Declarative mapping, it also uses Python type annotations to derive column information.

To provide context for users of the “old” way, Declarative mappings can still be made using Column objects (as well as using the declarative_base() function to create the base class) as before, and these forms will continue to be supported with no plans to remove support. The reason these two facilities are superseded by new constructs is first and foremost to integrate smoothly with PEP 484 tools, including IDEs such as VSCode and type checkers such as Mypy and Pyright, without the need for plugins. Secondly, deriving the declarations from type annotations is part of SQLAlchemy’s integration with Python dataclasses, which can now be generated natively from mappings.

For users who like the “old” way, but still desire their IDEs to not mistakenly report typing errors for their declarative mappings, the mapped_column() construct is a drop-in replacement for Column in an ORM Declarative mapping (note that mapped_column() is for ORM Declarative mappings only; it can’t be used within a Table construct), and the type annotations are optional. Our mapping above can be written without annotations as:

class User(Base):
    __tablename__ = "user_account"

    id = mapped_column(Integer, primary_key=True)
    name = mapped_column(String(30), nullable=False)
    fullname = mapped_column(String)

    addresses = relationship("Address", back_populates="user")

    # ... definition continues

The above class has an advantage over one that uses Column directly, in that the User class as well as instances of User will indicate the correct typing information to typing tools, without the use of plugins. mapped_column() also allows for additional ORM-specific parameters to configure behaviors such as deferred column loading, which previously needed a separate deferred() function to be used with Column.

There’s also an example of converting an old-style Declarative class to the new style, which can be seen at ORM Declarative Models in the What’s New in SQLAlchemy 2.0? guide.

See also

ORM Mapping Styles - full background on different ORM configurational styles.

Declarative Mapping - overview of Declarative class mapping

Declarative Table with mapped_column() - detail on how to use mapped_column() and Mapped to define the columns within a Table to be mapped when using Declarative.

Emitting DDL to the database from an ORM mapping¶

As our ORM mapped classes refer to Table objects contained within a MetaData collection, emitting DDL given the Declarative Base uses the same process as that described previously at Emitting DDL to the Database. In our case, we have already generated the user and address tables in our SQLite database. If we had not done so already, we would be free to make use of the MetaData associated with our ORM Declarative Base class in order to do so, by accessing the collection from the Base.metadata attribute and then using MetaData.create_all() as before:

Base.metadata.create_all(engine)

Combining Core Table Declarations with ORM Declarative¶

As an alternative approach to the mapping process shown previously at Declaring Mapped Classes, we may also make use of the Table objects we created directly in the section Setting up MetaData with Table objects in conjunction with declarative mapped classes from a Declarative Base class.

This form is called Declarative with Imperative Table, and it consists of assigning a Table object to the .__table__ attribute directly, rather than having the declarative process generate it from the .__tablename__ attribute with separate Mapped and/or mapped_column() directives. This is illustrated below by using our pre-existing user_table and address_table Table objects to map them to new classes (note to readers running code: these examples are for illustration only and should not be run):

class Base(DeclarativeBase):
    pass


class User(Base):
    __table__ = user_table

    addresses: Mapped[List["Address"]] = relationship(back_populates="user")

    def __repr__(self):
        return f"User({self.name!r}, {self.fullname!r})"


class Address(Base):
    __table__ = address_table

    user: Mapped["User"] = relationship(back_populates="addresses")

    def __repr__(self):
        return f"Address({self.email_address!r})"

The above two classes, User and Address, are equivalent to those which we declared in the previous mapping example using __tablename__ and mapped_column().

The traditional “declarative base” approach using __tablename__ to automatically generate Table objects remains the most popular method to declare table metadata. However, setting aside the ORM mapping functionality it achieves, as far as table declaration goes it’s merely a syntactical convenience on top of the Table constructor.

We will next refer to our ORM mapped classes above when we talk about data manipulation in terms of the ORM, in the section Inserting Rows using the ORM Unit of Work pattern.

Table Reflection¶

To round out the section on working with table metadata, we will illustrate another operation that was mentioned at the beginning of the section, that of table reflection. Table reflection refers to the process of generating Table and related objects by reading the current state of a database. Whereas in the previous sections we’ve been declaring Table objects in Python and then emitting DDL to the database, the reflection process does it in reverse.

As an example of reflection, we will create a new Table object which represents the some_table object we created manually in the earlier sections of this document. There are again some varieties of how this is performed, however the most basic is to construct a Table object, given the name of the table and a MetaData collection to which it will belong, then instead of indicating individual Column and Constraint objects, pass it the target Engine using the Table.autoload_with parameter:

>>> some_table = Table("some_table", metadata_obj, autoload_with=engine)
BEGIN (implicit)
PRAGMA main.table_...info("some_table")
[raw sql] ()
SELECT sql FROM (SELECT * FROM sqlite_master UNION ALL SELECT * FROM sqlite_temp_master) WHERE name = ? AND type in ('table', 'view')
[raw sql] ('some_table',)
PRAGMA main.foreign_key_list("some_table")
...
PRAGMA main.index_list("some_table")
...
ROLLBACK

At the end of the process, the some_table object now contains the information about the Column objects present in the table, and the object is usable in exactly the same way as a Table that we declared explicitly:

>>> some_table
Table('some_table', MetaData(),
    Column('x', INTEGER(), table=<some_table>),
    Column('y', INTEGER(), table=<some_table>),
    schema=None)
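The reflection round trip can also be shown in self-contained form, using a throwaway in-memory SQLite database with a table created outside of SQLAlchemy's metadata system:

```python
from sqlalchemy import MetaData, Table, create_engine, text

# Stand-in for a pre-existing database: create a table using raw SQL
engine = create_engine("sqlite+pysqlite:///:memory:")
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE some_table (x INTEGER, y INTEGER)"))

# Reflect the table: column information is read from the database
# rather than being spelled out in Python
metadata_obj = MetaData()
some_table = Table("some_table", metadata_obj, autoload_with=engine)

print(some_table.c.keys())  # ['x', 'y']
```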

See also

Read more about table and schema reflection at Reflecting Database Objects.

For ORM-related variants of table reflection, the section Mapping Declaratively with Reflected Tables includes an overview of the available options.

© Copyright 2007-2022, the SQLAlchemy authors and contributors.


Created using Sphinx 5.3.0. Documentation last generated: Mon 05 Dec 2022 06:16:03 PM