Working with Database Metadata
With engines and SQL execution down, we are ready to begin some Alchemy. The central element of both SQLAlchemy Core and ORM is the SQL Expression Language which allows for fluent, composable construction of SQL queries. The foundation for these queries are Python objects that represent database concepts like tables and columns. These objects are known collectively as database metadata.
The most common foundational objects for database metadata in SQLAlchemy are known as MetaData, Table, and Column.
The sections below will illustrate how these objects are used in both a
Core-oriented style as well as an ORM-oriented style.
ORM readers, stay with us!
As with other sections, Core users can skip the ORM sections, but ORM users
would best be familiar with these objects from both perspectives.
The Table object discussed here is declared in a more indirect (and also fully Python-typed) way when using the ORM, however there is still a Table object within the ORM’s configuration.
Setting up MetaData with Table objects
When we work with a relational database, the basic data-holding structure in the database which we query from is known as a table. In SQLAlchemy, the database “table” is ultimately represented by a Python object similarly named Table.
To start using the SQLAlchemy Expression Language, we will want to have Table objects constructed that represent all of the database tables we are interested in working with. The Table is constructed programmatically, either directly by using the Table constructor, or indirectly by using ORM Mapped classes (described later at Using ORM Declarative Forms to Define Table Metadata).
Within the construction of Table, the individual columns and constraints within the table themselves may be declared, meaning we explicitly spell out in source code what each column and constraint in the table looks like; or, if working with an existing database, there is also the option to collect this information directly from that database, meaning the contents of each Table are reflected without us having to declare all (or any) of it explicitly.
Whichever approach is used, we always start out with a collection in which to place our tables, known as the MetaData object. This object is essentially a facade around a Python dictionary that stores a series of Table objects keyed to their string name. Constructing this object looks like:
>>> from sqlalchemy import MetaData
>>> metadata_obj = MetaData()
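Since the MetaData object is described as a facade around a dictionary of Table objects keyed on their string names, a short sketch (assuming a table has already been added to the collection, as shown later in this section) illustrates looking a table up through the MetaData.tables mapping:

```python
from sqlalchemy import Column, Integer, MetaData, Table

metadata_obj = MetaData()
user = Table("user_account", metadata_obj, Column("id", Integer, primary_key=True))

# Table objects register themselves with the MetaData collection,
# keyed on their string name
print(metadata_obj.tables["user_account"] is user)
```

This lookup style is occasionally useful when table objects are created in one module and consumed in another.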
Having a single MetaData object for an entire application is the most common case, represented as a module-level variable in a single place in an application, often in a “models” or “dbschema” type of package. It is also very common that the MetaData is accessed via an ORM-centric registry or base class such as the Declarative Base, so that this same MetaData is shared among ORM- and Core-declared Table objects.
There can be multiple MetaData collections as well; however, it’s typically most helpful if a series of Table objects that are related to each other belong to a single MetaData collection.
Once we have a MetaData object, we can declare some Table objects. This tutorial will start with the classic SQLAlchemy tutorial model: the table user_account, which would for example represent the users of a website, and the table address, representing a list of email addresses associated with rows in the user_account table. When not using ORM Declarative models at all, we construct each Table object directly, typically assigning each to a variable that will be how we refer to the table in application code:
>>> from sqlalchemy import Table, Column, Integer, String
>>> user_table = Table(
... "user_account",
... metadata_obj,
... Column("id", Integer, primary_key=True),
... Column("name", String(30)),
... Column("fullname", String),
... )
Components of Table
We can observe that the Table construct looks a lot like a SQL CREATE TABLE statement: starting with the table name, then listing out each column, where each column has a name and a datatype. The objects we use above are:
- Table - represents a database table and assigns itself to a MetaData collection.
- Column - represents a column in a database table, and assigns itself to a Table object. The Column usually includes a string name and a type object. The collection of Column objects in terms of the parent Table is typically accessed via an associative array located at Table.c:

  >>> user_table.c.name
  Column('name', String(length=30), table=<user_account>)
  >>> user_table.c.keys()
  ['id', 'name', 'fullname']

- Integer, String - these classes represent SQL datatypes and can be passed to a Column with or without necessarily being instantiated. Above, we want to give a length of “30” to the “name” column, so we instantiated String(30). But for “id” and “fullname” we did not specify these, so we can send the class itself.
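Tying the points above together, a brief sketch shows that columns are reachable by name through Table.c, and that a datatype passed as a plain class is instantiated for us behind the scenes:

```python
from sqlalchemy import Column, Integer, MetaData, String, Table

metadata_obj = MetaData()
user_table = Table(
    "user_account",
    metadata_obj,
    Column("id", Integer, primary_key=True),
    Column("name", String(30)),
    # datatype passed as a plain class; SQLAlchemy instantiates it for us
    Column("fullname", String),
)

# columns are accessible by name via the Table.c collection
print(user_table.c.keys())
print(type(user_table.c.fullname.type))
```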
See also
The reference and API documentation for MetaData, Table and Column is at Describing Databases with MetaData.
The reference documentation for datatypes is at SQL Datatype Objects.
In an upcoming section, we will illustrate one of the fundamental functions of Table, which is to generate DDL on a particular database connection. But first we will declare a second Table.
Declaring Simple Constraints
The first Column in the example user_table includes the Column.primary_key parameter, which is a shorthand technique of indicating that this Column should be part of the primary key for this table. The primary key itself is normally declared implicitly and is represented by the PrimaryKeyConstraint construct, which we can see on the Table.primary_key attribute of the Table object:
>>> user_table.primary_key
PrimaryKeyConstraint(Column('id', Integer(), table=<user_account>, primary_key=True, nullable=False))
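Though the tutorial model uses a single-column primary key, marking more than one Column with primary_key=True produces a single composite PrimaryKeyConstraint; a minimal sketch using a hypothetical order_item table, not part of the tutorial model:

```python
from sqlalchemy import Column, Integer, MetaData, Table

metadata_obj = MetaData()

# both columns become members of one composite PrimaryKeyConstraint
order_item = Table(
    "order_item",
    metadata_obj,
    Column("order_id", Integer, primary_key=True),
    Column("item_id", Integer, primary_key=True),
)
print([col.name for col in order_item.primary_key.columns])
```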
The constraint that is most typically declared explicitly is the ForeignKeyConstraint object that corresponds to a database foreign key constraint. When we declare tables that are related to each other, SQLAlchemy uses the presence of these foreign key constraint declarations not only so that they are emitted within CREATE statements to the database, but also to assist in constructing SQL expressions.
A ForeignKeyConstraint that involves only a single column on the target table is typically declared using a column-level shorthand notation via the ForeignKey object. Below we declare a second table address that will have a foreign key constraint referring to the user table:
>>> from sqlalchemy import ForeignKey
>>> address_table = Table(
... "address",
... metadata_obj,
... Column("id", Integer, primary_key=True),
... Column("user_id", ForeignKey("user_account.id"), nullable=False),
... Column("email_address", String, nullable=False),
... )
The table above also features a third kind of constraint, which in SQL is the “NOT NULL” constraint, indicated above using the Column.nullable parameter.
Tip
When using the ForeignKey object within a Column definition, we can omit the datatype for that Column; it is automatically inferred from that of the related column, in the above example the Integer datatype of the user_account.id column.
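The column-level ForeignKey shorthand is, as far as the resulting schema goes, equivalent to spelling out a table-level ForeignKeyConstraint; a sketch of the same address table using the explicit form:

```python
from sqlalchemy import (
    Column,
    ForeignKeyConstraint,
    Integer,
    MetaData,
    String,
    Table,
)

metadata_obj = MetaData()
Table("user_account", metadata_obj, Column("id", Integer, primary_key=True))

# table-level form of the constraint; note the user_id datatype must now
# be stated explicitly, as there is no column-level ForeignKey to infer it from
address_table = Table(
    "address",
    metadata_obj,
    Column("id", Integer, primary_key=True),
    Column("user_id", Integer, nullable=False),
    Column("email_address", String, nullable=False),
    ForeignKeyConstraint(["user_id"], ["user_account.id"]),
)
print(len(address_table.foreign_keys))
```

The table-level form becomes necessary for composite foreign keys that span more than one column.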
In the next section we will emit the completed DDL for the user and address tables to see the completed result.
Emitting DDL to the Database
We’ve constructed an object structure that represents two database tables in a database, starting at the root MetaData object, then into two Table objects, each of which holds onto a collection of Column and Constraint objects. This object structure will be at the center of most operations we perform with both Core and ORM going forward.
The first useful thing we can do with this structure will be to emit CREATE TABLE statements, or DDL, to our SQLite database so that we can insert and query data from them. We already have all the tools needed to do so, by invoking the MetaData.create_all() method on our MetaData, sending it the Engine that refers to the target database:
>>> metadata_obj.create_all(engine)
BEGIN (implicit)
PRAGMA main.table_...info("user_account")
...
PRAGMA main.table_...info("address")
...
CREATE TABLE user_account (
id INTEGER NOT NULL,
name VARCHAR(30),
fullname VARCHAR,
PRIMARY KEY (id)
)
...
CREATE TABLE address (
id INTEGER NOT NULL,
user_id INTEGER NOT NULL,
email_address VARCHAR NOT NULL,
PRIMARY KEY (id),
FOREIGN KEY(user_id) REFERENCES user_account (id)
)
...
COMMIT
The DDL create process by default includes some SQLite-specific PRAGMA statements that test for the existence of each table before emitting a CREATE. The full series of steps is also included within a BEGIN/COMMIT pair to accommodate transactional DDL (SQLite does actually support transactional DDL, however the sqlite3 database driver historically runs DDL in “autocommit” mode).
The create process also takes care of emitting CREATE statements in the correct order; above, the FOREIGN KEY constraint is dependent on the user table existing, so the address table is created second. In more complicated dependency scenarios the FOREIGN KEY constraints may also be applied to tables after the fact using ALTER.
The MetaData object also features a MetaData.drop_all() method that will emit DROP statements in the reverse order as it would emit CREATE statements, in order to drop schema elements.
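A minimal round trip against an in-memory SQLite engine illustrates both methods; the inspect() helper is used here only to verify table existence:

```python
from sqlalchemy import Column, Integer, MetaData, Table, create_engine, inspect

engine = create_engine("sqlite://")
metadata_obj = MetaData()
Table("user_account", metadata_obj, Column("id", Integer, primary_key=True))

# emit CREATE TABLE statements
metadata_obj.create_all(engine)
print(inspect(engine).has_table("user_account"))

# emit DROP TABLE statements, in reverse dependency order
metadata_obj.drop_all(engine)
print(inspect(engine).has_table("user_account"))
```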
Using ORM Declarative Forms to Define Table Metadata
When using the ORM, the process by which we declare Table metadata is usually combined with the process of declaring mapped classes. The mapped class is any Python class we’d like to create, which will then have attributes on it that will be linked to the columns in a database table. While there are a few varieties of how this is achieved, the most common style is known as declarative, and allows us to declare our user-defined classes and Table metadata at once.
Establishing a Declarative Base
When using the ORM, the MetaData collection remains present; however, it itself is associated with an ORM-only construct commonly referred to as the Declarative Base. The most expedient way to acquire a new Declarative Base is to create a new class that subclasses the SQLAlchemy DeclarativeBase class:
>>> from sqlalchemy.orm import DeclarativeBase
>>> class Base(DeclarativeBase):
... pass
Above, the Base class is what we’ll refer to as the Declarative Base. When we make new classes that are subclasses of Base, combined with appropriate class-level directives, they will each be established as part of an object relational mapping against a particular database table (or tables, in advanced usages).
The Declarative Base, when declared as a new class, refers to a MetaData collection that is created for us automatically (options exist to use our own MetaData object as well); this MetaData is accessible via the .metadata class-level attribute. As we create new mapped classes, each will reference a Table within this MetaData collection:
>>> Base.metadata
MetaData()
The Declarative Base also refers to a collection called registry, which is the central “mapper configuration” unit in the SQLAlchemy ORM. While seldom accessed directly, this object is central to the mapper configuration process, as a set of ORM mapped classes will coordinate with each other via this registry. As was the case with MetaData, our Declarative Base also created a registry for us (again with options to pass our own registry), which we can access via the .registry class variable:
>>> Base.registry
<sqlalchemy.orm.decl_api.registry object at 0x...>
registry also provides other mapper configurational patterns, including different ways to acquire a Declarative Base object, as well as class decorators and class-processing functions which allow user-defined classes to be mapped without using any particular base class. Therefore, keep in mind that all the ORM patterns here that use “declarative base” can just as easily use other patterns based on class decorators or configurational functions.
Declaring Mapped Classes
With the Base class established, we can now define ORM mapped classes for the user_account and address tables in terms of new classes User and Address. We illustrate below the most modern form of Declarative, which is driven from PEP 484 type annotations using a special type Mapped, which indicates attributes to be mapped as particular types:
>>> from typing import List
>>> from typing import Optional
>>> from sqlalchemy.orm import Mapped
>>> from sqlalchemy.orm import mapped_column
>>> from sqlalchemy.orm import relationship
>>> class User(Base):
... __tablename__ = "user_account"
...
... id: Mapped[int] = mapped_column(primary_key=True)
... name: Mapped[str] = mapped_column(String(30))
... fullname: Mapped[Optional[str]]
...
... addresses: Mapped[List["Address"]] = relationship(back_populates="user")
...
... def __repr__(self) -> str:
... return f"User(id={self.id!r}, name={self.name!r}, fullname={self.fullname!r})"
>>> class Address(Base):
... __tablename__ = "address"
...
... id: Mapped[int] = mapped_column(primary_key=True)
... email_address: Mapped[str]
... user_id = mapped_column(ForeignKey("user_account.id"))
...
... user: Mapped[User] = relationship(back_populates="addresses")
...
... def __repr__(self) -> str:
... return f"Address(id={self.id!r}, email_address={self.email_address!r})"
The two classes above, User and Address, are now referred to as ORM Mapped Classes, and are available for use in ORM persistence and query operations, which will be described later. Details about these classes include:
- Each class refers to a Table object that was generated as part of the declarative mapping process, and is equivalent in structure to the Table objects we constructed directly in the previous Core section. This Table is available from an attribute added to the class called __table__.
- To indicate columns in the Table, we use the mapped_column() construct, in combination with typing annotations based on the Mapped type.
- For columns with simple datatypes and no other options, we can indicate a Mapped type annotation alone, using simple Python types like int and str to mean Integer and String. Customization of how Python types are interpreted within the Declarative mapping process is very open ended; see the section Using Annotated Declarative Table (Type Annotated Forms for mapped_column()) for background.
- A column can be declared as “nullable” or “not null” based on the presence of the Optional[] type annotation; alternatively, the mapped_column.nullable parameter may be used instead.
- Usage of explicit typing annotations is completely optional. We can also use mapped_column() without annotations. When using this form, we would use more explicit type objects like Integer and String, as well as nullable=False, as needed within each mapped_column() construct.
- Two additional attributes, User.addresses and Address.user, define a different kind of attribute called relationship(), which features similar annotation-aware configuration styles as shown. The relationship() construct is discussed more fully at Working with Related Objects.
- The classes are automatically given an __init__() method if we don’t declare one of our own. The default form of this method accepts all attribute names as optional keyword arguments:

  >>> sandy = User(name="sandy", fullname="Sandy Cheeks")

- The __repr__() methods are added so that we get a readable string output; there’s no requirement for these methods to be here.
See also
ORM Mapping Styles - full background on different ORM configurational styles.
Declarative Mapping - overview of Declarative class mapping
Declarative Table with mapped_column() - detail on how to use mapped_column() and Mapped to define the columns within a Table to be mapped when using Declarative.
Emitting DDL to the database from an ORM mapping
As our ORM mapped classes refer to Table objects contained within a MetaData collection, emitting DDL given the Declarative Base uses the same process as that described previously at Emitting DDL to the Database. In our case, we have already generated the user and address tables in our SQLite database. If we had not done so already, we would be free to make use of the MetaData associated with our ORM Declarative Base class in order to do so, by accessing the collection from the Base.metadata attribute and then using MetaData.create_all() as before:
Base.metadata.create_all(engine)
Combining Core Table Declarations with ORM Declarative
As an alternative approach to the mapping process shown previously at Declaring Mapped Classes, we may also make use of the Table objects we created directly in the section Setting up MetaData with Table objects, in conjunction with declarative mapped classes from a Declarative Base class.
This form is called Declarative with Imperative Table, and it consists of assigning a Table object to the .__table__ attribute directly, rather than having the declarative process generate it from the .__tablename__ attribute with separate Mapped and/or mapped_column() directives. This is illustrated below by using our pre-existing user_table and address_table Table objects to map them to new classes (note to readers running code: these examples are for illustration only and should not be run):
class Base(DeclarativeBase):
pass
class User(Base):
__table__ = user_table
addresses: Mapped[List["Address"]] = relationship(back_populates="user")
def __repr__(self):
return f"User({self.name!r}, {self.fullname!r})"
class Address(Base):
__table__ = address_table
user: Mapped["User"] = relationship(back_populates="addresses")
def __repr__(self):
return f"Address({self.email_address!r})"
The above two classes, User and Address, are equivalent to those which we declared in the previous mapping example using __tablename__ and mapped_column().
The traditional “declarative base” approach using __tablename__ to automatically generate Table objects remains the most popular method to declare table metadata. However, disregarding the ORM mapping functionality it achieves, as far as table declaration goes it’s merely a syntactical convenience on top of the Table constructor.
We will next refer to our ORM mapped classes above when we talk about data manipulation in terms of the ORM, in the section Inserting Rows using the ORM Unit of Work pattern.
Table Reflection
To round out the section on working with table metadata, we will illustrate another operation that was mentioned at the beginning of the section, that of table reflection. Table reflection refers to the process of generating Table and related objects by reading the current state of a database. Whereas in the previous sections we’ve been declaring Table objects in Python and then emitting DDL to the database, the reflection process does it in reverse.
As an example of reflection, we will create a new Table object which represents the some_table object we created manually in the earlier sections of this document. There are again some varieties of how this is performed, however the most basic is to construct a Table object, given the name of the table and a MetaData collection to which it will belong, then instead of indicating individual Column and Constraint objects, pass it the target Engine using the Table.autoload_with parameter:
>>> some_table = Table("some_table", metadata_obj, autoload_with=engine)
BEGIN (implicit)
PRAGMA main.table_...info("some_table")
[raw sql] ()
SELECT sql FROM (SELECT * FROM sqlite_master UNION ALL SELECT * FROM sqlite_temp_master) WHERE name = ? AND type in ('table', 'view')
[raw sql] ('some_table',)
PRAGMA main.foreign_key_list("some_table")
...
PRAGMA main.index_list("some_table")
...
ROLLBACK
At the end of the process, the some_table object now contains the information about the Column objects present in the table, and the object is usable in exactly the same way as a Table that we declared explicitly:
>>> some_table
Table('some_table', MetaData(),
Column('x', INTEGER(), table=<some_table>),
Column('y', INTEGER(), table=<some_table>),
schema=None)
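Related to the per-table form above, the MetaData.reflect() method loads all tables found in a database at once; a sketch against a scratch in-memory SQLite database:

```python
from sqlalchemy import MetaData, create_engine, text

engine = create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE some_table (x INTEGER, y INTEGER)"))

# reflect every table present in the database into the MetaData collection,
# rather than reflecting one Table at a time with autoload_with
metadata_obj = MetaData()
metadata_obj.reflect(bind=engine)
print(metadata_obj.tables["some_table"].c.keys())
```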
See also
Read more about table and schema reflection at Reflecting Database Objects.
For ORM-related variants of table reflection, the section Mapping Declaratively with Reflected Tables includes an overview of the available options.