types

SQLAlchemy custom types for use with the ORM.

class advanced_alchemy.types.GUID[source]

Bases: TypeDecorator

Platform-independent GUID type.

Uses PostgreSQL’s UUID type on PostgreSQL, DuckDB, and CockroachDB; MSSQL’s UNIQUEIDENTIFIER type; and Oracle’s RAW(16) type. On other dialects, falls back to BINARY(16) or CHAR(32), storing values as stringified hex.

Accepts stringified UUIDs (hex strings) as well as actual UUID objects.
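
For orientation, a minimal usage sketch (the declarative base and model names are illustrative, not part of this API):

    import uuid

    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    from advanced_alchemy.types import GUID


    class Base(DeclarativeBase):
        pass


    class User(Base):
        __tablename__ = "user_account"

        # Stored natively as UUID on PostgreSQL, RAW(16) on Oracle, and
        # BINARY(16)/CHAR(32) elsewhere; always returned as uuid.UUID.
        id: Mapped[uuid.UUID] = mapped_column(GUID, primary_key=True, default=uuid.uuid4)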

__init__(*args, binary=True, **kwargs)[source]

Construct a TypeDecorator.

Arguments sent here are passed to the constructor of the class assigned to the impl class level attribute, assuming the impl is a callable, and the resulting object is assigned to the self.impl instance attribute (thus overriding the class attribute of the same name).

If the class level impl is not a callable (the unusual case), it will be assigned to the same instance attribute ‘as-is’, ignoring those arguments passed to the constructor.

Subclasses can override this to customize the generation of self.impl entirely.

cache_ok: Optional[bool] = True

Indicate if statements using this ExternalType are “safe to cache”.

The default value None will emit a warning and then not allow caching of a statement which includes this type. Set to False to disable caching of statements using this type entirely, without emitting a warning. When set to True, the object’s class and selected elements from its state will be used as part of the cache key. For example, using a TypeDecorator:

class MyType(TypeDecorator):
    impl = String

    cache_ok = True

    def __init__(self, choices):
        self.choices = tuple(choices)
        self.internal_only = True

The cache key for the above type would be equivalent to:

>>> MyType(["a", "b", "c"])._static_cache_key
(<class '__main__.MyType'>, ('choices', ('a', 'b', 'c')))

The caching scheme will extract attributes from the type that correspond to the names of parameters in the __init__() method. Above, the “choices” attribute becomes part of the cache key but “internal_only” does not, because there is no parameter named “internal_only”.

The requirements for cacheable elements are that they are hashable and that they produce the same rendered SQL for expressions using this type, every time, for a given cache value.

To accommodate datatypes that refer to unhashable structures such as dictionaries, sets, and lists, these objects can be made “cacheable” by assigning hashable structures to the attributes whose names correspond with the names of the arguments. For example, a datatype which accepts a dictionary of lookup values may publish this as a sorted series of tuples. Given a previously un-cacheable type as:

class LookupType(UserDefinedType):
    """a custom type that accepts a dictionary as a parameter.

    this is the non-cacheable version, as "self.lookup" is not
    hashable.

    """

    def __init__(self, lookup):
        self.lookup = lookup

    def get_col_spec(self, **kw):
        return "VARCHAR(255)"

    def bind_processor(self, dialect): ...  # works with "self.lookup" ...

Where “lookup” is a dictionary, the type will not be able to generate a cache key:

>>> type_ = LookupType({"a": 10, "b": 20})
>>> type_._static_cache_key
<stdin>:1: SAWarning: UserDefinedType LookupType({'a': 10, 'b': 20}) will not
produce a cache key because the ``cache_ok`` flag is not set to True.
Set this flag to True if this type object's state is safe to use
in a cache key, or False to disable this warning.
symbol('no_cache')

If we did set up such a cache key, it wouldn’t be usable. We would get a tuple structure that contains a dictionary inside of it, which cannot itself be used as a key in a “cache dictionary” such as SQLAlchemy’s statement cache, since Python dictionaries aren’t hashable:

>>> # set cache_ok = True
>>> type_.cache_ok = True

>>> # this is the cache key it would generate
>>> key = type_._static_cache_key
>>> key
(<class '__main__.LookupType'>, ('lookup', {'a': 10, 'b': 20}))

>>> # however this key is not hashable, will fail when used with
>>> # SQLAlchemy statement cache
>>> some_cache = {key: "some sql value"}
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: unhashable type: 'dict'

The type may be made cacheable by assigning a sorted tuple of tuples to the “.lookup” attribute:

class LookupType(UserDefinedType):
    """a custom type that accepts a dictionary as a parameter.

    The dictionary is stored both as itself in a private variable,
    and published in a public variable as a sorted tuple of tuples,
    which is hashable and will also return the same value for any
    two equivalent dictionaries.  Note it assumes the keys and
    values of the dictionary are themselves hashable.

    """

    cache_ok = True

    def __init__(self, lookup):
        self._lookup = lookup

        # assume keys/values of "lookup" are hashable; otherwise
        # they would also need to be converted in some way here
        self.lookup = tuple((key, lookup[key]) for key in sorted(lookup))

    def get_col_spec(self, **kw):
        return "VARCHAR(255)"

    def bind_processor(self, dialect): ...  # works with "self._lookup" ...

Where above, the cache key for LookupType({"a": 10, "b": 20}) will be:

>>> LookupType({"a": 10, "b": 20})._static_cache_key
(<class '__main__.LookupType'>, ('lookup', (('a', 10), ('b', 20))))

Added in version 1.4.14: added the cache_ok flag to allow some configurability of caching for TypeDecorator classes.

Added in version 1.4.28: added the ExternalType mixin, which generalizes the cache_ok flag to both the TypeDecorator and UserDefinedType classes.

compare_values(x, y)[source]

Compare two values for equality.

Return type:

bool

impl: Union[TypeEngine[Any], Type[TypeEngine[Any]]] = BINARY(length=16)

load_dialect_impl(dialect)[source]

Return a TypeEngine object corresponding to a dialect.

This is an end-user override hook that can be used to provide differing types depending on the given dialect. It is used by the TypeDecorator implementation of type_engine() to help determine what type should ultimately be returned for a given TypeDecorator.

By default returns self.impl.

Return type:

Any

process_bind_param(value, dialect)[source]

Receive a bound parameter value to be converted.

Custom subclasses of TypeDecorator should override this method to provide custom behaviors for incoming data values. This method is called at statement execution time and is passed the literal Python data value which is to be associated with a bound parameter in the statement.

The operation could be anything desired to perform custom behavior, such as transforming or serializing data. This could also be used as a hook for validating logic.

Parameters:
  • value (Union[bytes, str, UUID, None]) – Data to operate upon, of any type expected by this method in the subclass. Can be None.

  • dialect (Dialect) – the Dialect in use.

Return type:

Union[bytes, str, None]

See also

Augmenting Existing Types

TypeDecorator.process_result_value()

process_result_value(value, dialect)[source]

Receive a result-row column value to be converted.

Custom subclasses of TypeDecorator should override this method to provide custom behaviors for data values being received in result rows coming from the database. This method is called at result fetching time and is passed the literal Python data value that’s extracted from a database result row.

The operation could be anything desired to perform custom behavior, such as transforming or deserializing data.

Parameters:
  • value (Union[bytes, str, UUID, None]) – Data to operate upon, of any type expected by this method in the subclass. Can be None.

  • dialect (Dialect) – the Dialect in use.

Return type:

Optional[UUID]

See also

Augmenting Existing Types

TypeDecorator.process_bind_param()

property python_type: type[UUID]

Return the Python type object expected to be returned by instances of this type, if known.

Basically, for those types which enforce a return type, or are known across the board to do so for all common DBAPIs (like int, for example), this will return that type.

If a return type is not defined, raises NotImplementedError.

Note that any type also accommodates NULL in SQL, which means you can also get back None from any type in practice.

static to_uuid(value)[source]

Return type:

Optional[UUID]

class advanced_alchemy.types.ORA_JSONB[source]

Bases: TypeDecorator, SchemaType

Oracle Binary JSON type.

JsonB = _JSON().with_variant(PG_JSONB, "postgresql").with_variant(ORA_JSONB, "oracle")
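
As a hedged sketch of how the composed JsonB type above might be used on a model (the model itself is illustrative):

    from typing import Any

    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    from advanced_alchemy.types import JsonB


    class Base(DeclarativeBase):
        pass


    class Document(Base):
        __tablename__ = "document"

        id: Mapped[int] = mapped_column(primary_key=True)
        # Renders as JSONB on PostgreSQL, as the BLOB-backed ORA_JSONB on
        # Oracle, and as generic JSON elsewhere.
        payload: Mapped[dict[str, Any]] = mapped_column(JsonB, default=dict)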

__init__(*args, **kwargs)[source]

Initialize the JSON type.

cache_ok: Optional[bool] = True

Indicate if statements using this ExternalType are “safe to cache”. See the full discussion under GUID.cache_ok above; the behavior here is identical.

coerce_compared_value(op, value)[source]

Suggest a type for a ‘coerced’ Python value in an expression.

By default, returns self. This method is called by the expression system when an object using this type is on the left or right side of an expression against a plain Python object which does not yet have a SQLAlchemy type assigned:

expr = table.c.somecolumn + 35

Where above, if somecolumn uses this type, this method will be called with the value operator.add and 35. The return value is whatever SQLAlchemy type should be used for 35 for this particular operation.

Return type:

Any

impl

alias of BLOB

load_dialect_impl(dialect)[source]

Return a TypeEngine object corresponding to a dialect.

This is an end-user override hook that can be used to provide differing types depending on the given dialect. It is used by the TypeDecorator implementation of type_engine() to help determine what type should ultimately be returned for a given TypeDecorator.

By default returns self.impl.

Return type:

TypeEngine

process_bind_param(value, dialect)[source]

Receive a bound parameter value to be converted.

Custom subclasses of TypeDecorator should override this method to provide custom behaviors for incoming data values. This method is called at statement execution time and is passed the literal Python data value which is to be associated with a bound parameter in the statement.

The operation could be anything desired to perform custom behavior, such as transforming or serializing data. This could also be used as a hook for validating logic.

Parameters:
  • value (Any) – Data to operate upon, of any type expected by this method in the subclass. Can be None.

  • dialect (Dialect) – the Dialect in use.

Return type:

Optional[Any]

See also

Augmenting Existing Types

TypeDecorator.process_result_value()

process_result_value(value, dialect)[source]

Receive a result-row column value to be converted.

Custom subclasses of TypeDecorator should override this method to provide custom behaviors for data values being received in result rows coming from the database. This method is called at result fetching time and is passed the literal Python data value that’s extracted from a database result row.

The operation could be anything desired to perform custom behavior, such as transforming or deserializing data.

Parameters:
  • value (Optional[bytes]) – Data to operate upon, of any type expected by this method in the subclass. Can be None.

  • dialect (Dialect) – the Dialect in use.

Return type:

Optional[Any]

See also

Augmenting Existing Types

TypeDecorator.process_bind_param()

property python_type: type[dict[str, Any]]

Return the Python type object expected to be returned by instances of this type, if known.

Basically, for those types which enforce a return type, or are known across the board to do such for all common DBAPIs (like int for example), will return that type.

If a return type is not defined, raises NotImplementedError.

Note that any type also accommodates NULL in SQL which means you can also get back None from any type in practice.

class advanced_alchemy.types.DateTimeUTC[source]

Bases: TypeDecorator

Timezone Aware DateTime.

Ensures UTC is stored in the database and that timezone-aware datetimes are returned for all dialects.
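
A minimal sketch of a UTC timestamp column (model names and the default callable are illustrative):

    import datetime

    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    from advanced_alchemy.types import DateTimeUTC


    class Base(DeclarativeBase):
        pass


    class Event(Base):
        __tablename__ = "event"

        id: Mapped[int] = mapped_column(primary_key=True)
        # Stored as UTC on bind; returned timezone-aware on result rows,
        # regardless of dialect.
        created_at: Mapped[datetime.datetime] = mapped_column(
            DateTimeUTC,
            default=lambda: datetime.datetime.now(datetime.timezone.utc),
        )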

cache_ok: Optional[bool] = True

Indicate if statements using this ExternalType are “safe to cache”. See the full discussion under GUID.cache_ok above; the behavior here is identical.

impl: Union[TypeEngine[Any], Type[TypeEngine[Any]]] = DateTime(timezone=True)

process_bind_param(value, dialect)[source]

Receive a bound parameter value to be converted.

Custom subclasses of TypeDecorator should override this method to provide custom behaviors for incoming data values. This method is called at statement execution time and is passed the literal Python data value which is to be associated with a bound parameter in the statement.

The operation could be anything desired to perform custom behavior, such as transforming or serializing data. This could also be used as a hook for validating logic.

Parameters:
  • value (Optional[datetime]) – Data to operate upon, of any type expected by this method in the subclass. Can be None.

  • dialect (Dialect) – the Dialect in use.

Return type:

Optional[datetime]

See also

Augmenting Existing Types

TypeDecorator.process_result_value()

process_result_value(value, dialect)[source]

Receive a result-row column value to be converted.

Custom subclasses of TypeDecorator should override this method to provide custom behaviors for data values being received in result rows coming from the database. This method is called at result fetching time and is passed the literal Python data value that’s extracted from a database result row.

The operation could be anything desired to perform custom behavior, such as transforming or deserializing data.

Parameters:
  • value (Optional[datetime]) – Data to operate upon, of any type expected by this method in the subclass. Can be None.

  • dialect (Dialect) – the Dialect in use.

Return type:

Optional[datetime]

See also

Augmenting Existing Types

TypeDecorator.process_bind_param()

property python_type: type[datetime]

Return the Python type object expected to be returned by instances of this type, if known.

Basically, for those types which enforce a return type, or are known across the board to do such for all common DBAPIs (like int for example), will return that type.

If a return type is not defined, raises NotImplementedError.

Note that any type also accommodates NULL in SQL which means you can also get back None from any type in practice.

class advanced_alchemy.types.EncryptedString[source]

Bases: TypeDecorator

SQLAlchemy TypeDecorator for storing encrypted string values in a database.

This type provides transparent encryption/decryption of string values using the specified backend. It extends sqlalchemy.types.TypeDecorator and implements String as its underlying type.

Parameters:
  • key (str | bytes | Callable[[], str | bytes] | None) – The encryption key. Can be a string, bytes, or callable returning either. Defaults to os.urandom(32).

  • backend (Type[EncryptionBackend] | None) – The encryption backend class to use. Defaults to FernetBackend.

  • length (int | None) – The length of the unencrypted string. This is used for documentation and validation purposes only, as encrypted strings will be longer.

  • **kwargs (Any | None) – Additional arguments passed to the underlying String type.
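
A hedged usage sketch follows; the environment variable name and model are assumptions, and a stable key should be supplied in practice, since the os.urandom(32) default differs per process and would leave previously stored values undecryptable:

    import os

    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    from advanced_alchemy.types import EncryptedString


    class Base(DeclarativeBase):
        pass


    class Account(Base):
        __tablename__ = "account"

        id: Mapped[int] = mapped_column(primary_key=True)
        # Encrypted with the default FernetBackend before hitting the
        # database; decrypted transparently on read. The callable key is
        # resolved lazily (APP_ENCRYPTION_KEY is an assumed variable name).
        api_token: Mapped[str] = mapped_column(
            EncryptedString(key=lambda: os.environ["APP_ENCRYPTION_KEY"], length=255)
        )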

key

The encryption key.

Type:

str | bytes | Callable[[], str | bytes]

backend

The encryption backend instance.

Type:

EncryptionBackend

length

The unencrypted string length.

Type:

int | None

__init__(key=b"\x00\xfdj\xb5GK[^\x7f>\xb1'^\xf8\x98\x17)\xcc\x07\xde\xcd5\x01\xc5\x99\xd4\x93\xc8=\x98\xb2\xf3", backend=<class 'advanced_alchemy.types.encrypted_string.FernetBackend'>, length=None, **kwargs)[source]

Initializes the EncryptedString TypeDecorator.

Parameters:
  • key (str | bytes | Callable[[], str | bytes] | None) – The encryption key. Can be a string, bytes, or callable returning either. Defaults to os.urandom(32).

  • backend (Type[EncryptionBackend] | None) – The encryption backend class to use. Defaults to FernetBackend.

  • length (int | None) – The length of the unencrypted string. This is used for documentation and validation purposes only.

  • **kwargs (Any | None) – Additional arguments passed to the underlying String type.

cache_ok: Optional[bool] = True

Indicate if statements using this ExternalType are “safe to cache”. See the full discussion under GUID.cache_ok above; the behavior here is identical.

impl

alias of String

load_dialect_impl(dialect)[source]

Loads the appropriate dialect implementation based on the database dialect.

Note: The actual column length will be larger than the specified length due to encryption overhead. For most encryption methods, the encrypted string will be approximately 1.35x longer than the original.

Parameters:

dialect (Dialect) – The SQLAlchemy dialect.

Returns:

The dialect-specific type descriptor.

Return type:

Any

mount_vault()[source]

Mounts the vault with the encryption key.

If the key is callable, it is called to retrieve the key. Otherwise, the key is used directly.

Return type:

None

process_bind_param(value, dialect)[source]

Processes the value before binding it to the SQL statement.

This method encrypts the value using the specified backend and validates length if specified.

Parameters:
  • value (Any) – The value to process.

  • dialect (Dialect) – The SQLAlchemy dialect.

Raises:

IntegrityError – If the unencrypted value exceeds the maximum length.

Returns:

The encrypted value or None if the input is None.

Return type:

str | None

process_result_value(value, dialect)[source]

Processes the value after retrieving it from the database.

This method decrypts the value using the specified backend.

Parameters:
  • value (Any) – The value to process.

  • dialect (Dialect) – The SQLAlchemy dialect.

Returns:

The decrypted value or None if the input is None.

Return type:

str | None

property python_type: type[str]

Returns the Python type for this type decorator.

Returns:

The Python string type.

Return type:

Type[str]

class advanced_alchemy.types.EncryptedText[source]

Bases: EncryptedString

SQLAlchemy TypeDecorator for storing encrypted text/CLOB values in a database.

This type provides transparent encryption/decryption of text values using the specified backend. It extends EncryptedString and implements Text as its underlying type. This is suitable for storing larger encrypted text content compared to EncryptedString.

Parameters:
  • key (str | bytes | Callable[[], str | bytes] | None) – The encryption key. Can be a string, bytes, or callable returning either. Defaults to os.urandom(32).

  • backend (Type[EncryptionBackend] | None) – The encryption backend class to use. Defaults to FernetBackend.

  • **kwargs (Any | None) – Additional arguments passed to the underlying String type.
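
A sketch mirroring the EncryptedString example, using Text storage for larger values (model and environment variable names are assumptions):

    import os

    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    from advanced_alchemy.types import EncryptedText


    class Base(DeclarativeBase):
        pass


    class Note(Base):
        __tablename__ = "note"

        id: Mapped[int] = mapped_column(primary_key=True)
        # Same transparent encrypt/decrypt behavior as EncryptedString,
        # but backed by Text/CLOB so large bodies are not constrained by
        # VARCHAR length limits.
        body: Mapped[str] = mapped_column(
            EncryptedText(key=lambda: os.environ["APP_ENCRYPTION_KEY"])
        )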

cache_ok: Optional[bool] = True

Indicate if statements using this ExternalType are “safe to cache”. See the full discussion under GUID.cache_ok above; the behavior here is identical.

impl

alias of Text

load_dialect_impl(dialect)[source]

Loads the appropriate dialect implementation for Text type.

Parameters:

dialect (Dialect) – The SQLAlchemy dialect.

Returns:

The dialect-specific Text type descriptor.

Return type:

Any

class advanced_alchemy.types.EncryptionBackend[source]

Bases: ABC

Abstract base class for encryption backends.

This class defines the interface that all encryption backends must implement. Concrete implementations should provide the actual encryption/decryption logic.
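
To illustrate the interface, a toy subclass sketch (not a real cipher, purely the shape of the API; it assumes only the abstract methods documented below):

    from typing import Any

    from advanced_alchemy.types import EncryptionBackend


    class ReversingBackend(EncryptionBackend):
        """Toy backend that merely reverses strings; never use for real data."""

        def init_engine(self, key: "bytes | str") -> None:
            # A real backend would derive cipher state from the key here.
            self._key = key if isinstance(key, bytes) else key.encode()

        def encrypt(self, value: Any) -> str:
            return str(value)[::-1]

        def decrypt(self, value: Any) -> str:
            return str(value)[::-1]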

passphrase

The encryption passphrase used by the backend.

Type:

bytes

abstractmethod decrypt(value)[source]

Decrypts the given value.

Parameters:

value (Any) – The value to decrypt.

Returns:

The decrypted value.

Return type:

str

Raises:

NotImplementedError – If the method is not implemented by the subclass.

abstractmethod encrypt(value)[source]

Encrypts the given value.

Parameters:

value (Any) – The value to encrypt.

Returns:

The encrypted value.

Return type:

str

Raises:

NotImplementedError – If the method is not implemented by the subclass.

abstractmethod init_engine(key)[source]

Initializes the encryption engine with the provided key.

Parameters:

key (bytes | str) – The encryption key.

Raises:

NotImplementedError – If the method is not implemented by the subclass.

mount_vault(key)[source]

Mounts the vault with the provided encryption key.

Parameters:

key (str | bytes) – The encryption key used to initialize the backend.

class advanced_alchemy.types.FernetBackend[source]

Bases: EncryptionBackend

Fernet-based encryption backend.

This backend uses the Python cryptography library’s Fernet implementation for encryption/decryption operations. Provides symmetric encryption with built-in rotation support.

key

The base64-encoded key used for encryption and decryption.

Type:

bytes

fernet

The Fernet instance used for encryption/decryption.

Type:

cryptography.fernet.Fernet

decrypt(value)[source]

Decrypts the given value using Fernet.

Parameters:

value (Any) – The value to decrypt.

Returns:

The decrypted value.

Return type:

str

encrypt(value)[source]

Encrypts the given value using Fernet.

Parameters:

value (Any) – The value to encrypt.

Returns:

The encrypted value.

Return type:

str

init_engine(key)[source]

Initializes the Fernet engine with the provided key.

Parameters:

key (bytes | str) – The encryption key.

mount_vault(key)[source]

Mounts the vault with the provided encryption key.

This method hashes the key using SHA256 before initializing the engine.

Parameters:

key (str | bytes) – The encryption key.

class advanced_alchemy.types.FileObject[source]

Bases: object

Represents file metadata during processing using a dataclass structure.

This class provides a unified interface for handling file metadata and operations across different storage backends.

Content or a source path can optionally be provided during initialization via kwargs; it is stored internally as pending data, and the save/save_async methods persist it using the configured backend.
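
A hedged sketch of that pending-data flow. The backend key "local" is an assumption (a backend must already be registered under it; see StorageRegistry below), as is passing the backend as a string key:

    from advanced_alchemy.types import FileObject

    # Pending content is held in memory until save() pushes it to the
    # backend registered under the assumed key "local".
    obj = FileObject(
        backend="local",
        filename="report.txt",
        content_type="text/plain",
        content=b"hello world",
    )
    obj = obj.save()          # persists pending content via the backend
    data = obj.get_content()  # round-trips the stored bytes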

__eq__(other)[source]

Check equality based on filename and backend key.

Parameters:

other (object) – The object to compare with.

Returns:

True if the objects are equal, False otherwise.

Return type:

bool

__hash__()[source]

Return a hash based on filename and backend key.

Return type:

int

__init__(backend, filename, to_filename=None, content_type=None, size=None, last_modified=None, checksum=None, etag=None, version_id=None, metadata=None, source_path=None, content=None)[source]

Perform post-initialization validation and setup.

Handles default path, content type guessing, backend protocol inference, and processing of ‘content’ or ‘source_path’ from extra kwargs.

Raises:

ValueError – If filename is not provided, size is negative, backend/protocol mismatch, or both ‘content’ and ‘source_path’ are provided.

__repr__()[source]

Return a string representation of the FileObject.

Return type:

str

property backend: StorageBackend
property checksum: str | None
property content_type: str
delete()[source]

Delete the file from storage.

Raises:

RuntimeError – If no backend is configured or path is missing.

Return type:

None

async delete_async()[source]

Delete the file from storage asynchronously.

Return type:

None

property etag: str | None
property filename: str
get_content(*, options=None)[source]

Get the file content from the storage backend.

Parameters:

options – Optional backend-specific options.

Returns:

The file content.

Return type:

bytes

async get_content_async(*, options=None)[source]

Get the file content from the storage backend asynchronously.

Parameters:

options – Optional backend-specific options.

Returns:

The file content.

Return type:

bytes

property has_pending_data: bool
property last_modified: float | None
property metadata: dict[str, Any]
property path: str
property protocol: str
save(data=None, *, use_multipart=None, chunk_size=5242880, max_concurrency=12)[source]

Save data to the storage backend using this FileObject’s metadata.

If data is provided, it is used directly. If data is None, the internally stored content or source path is used. Pending attributes are cleared after a successful save.

Parameters:
  • data (Union[IO[bytes], Path, bytes, Iterator[bytes], Iterable[bytes], None]) – Optional data to save (bytes, iterator, file-like, Path). If None, internal pending data is used.

  • use_multipart (Optional[bool]) – Passed to the backend’s save method.

  • chunk_size (int) – Passed to the backend’s save method.

  • max_concurrency (int) – Passed to the backend’s save method.

Return type:

FileObject

Returns:

The updated FileObject instance returned by the backend.

Raises:

TypeError – If trying to save async data synchronously.

async save_async(data=None, *, use_multipart=None, chunk_size=5242880, max_concurrency=12)[source]

Save data to the storage backend asynchronously.

If data is provided, it is used directly. If data is None, the internally stored content or source path is used. Pending attributes are cleared after a successful save. Reading from source_path falls back to asyncio.to_thread if the backend does not handle Path objects directly.

Parameters:
  • data – Optional data to save. If None, internal pending data is used.

  • use_multipart – Passed to the backend’s save method.

  • chunk_size – Passed to the backend’s save method.

  • max_concurrency – Passed to the backend’s save method.

Return type:

FileObject

Returns:

The updated FileObject instance returned by the backend.

Raises:

TypeError – If trying to save sync data asynchronously.

sign(*, expires_in=None, for_upload=False)[source]

Generate a signed URL for the file.

Parameters:
  • expires_in – Optional expiration time in seconds.

  • for_upload – Whether the URL is for upload.

Raises:

RuntimeError – If no signed URL is generated.

Returns:

The signed URL.

Return type:

str

async sign_async(*, expires_in=None, for_upload=False)[source]

Generate a signed URL for the file asynchronously.

Parameters:
  • expires_in – Optional expiration time in seconds.

  • for_upload – Whether the URL is for upload.

Returns:

The signed URL.

Return type:

str

Raises:

RuntimeError – If no signed URL is generated.

property size: int | None
to_dict()[source]

Convert FileObject to a dictionary for storage or serialization.

Note: The ‘backend’ attribute is intentionally excluded, as it is often not serializable or relevant for storage representations. The ‘extra’ dict is included.

Returns:

A dictionary representation of the file information.

Return type:

dict[str, Any]

update_metadata(metadata)[source]

Update the file metadata.

Parameters:

metadata (dict[str, typing.Any]) – New metadata to merge with existing metadata.

Return type:

None

property version_id: str | None

class advanced_alchemy.types.MutableList[source]

Bases: MutableList[T]

A list type that implements Mutable.

The MutableList object implements a list that will emit change events to the underlying mapping when the contents of the list are altered, including when values are added or removed.

This is a replication of the default MutableList provided by SQLAlchemy. The difference here is the _removed property, which keeps every element removed from the list so that they can be deleted after a commit and retained when the session is rolled back.
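
A minimal sketch of attaching MutableList to a JSON column so in-place mutations mark the attribute dirty (model names are illustrative):

    from sqlalchemy import JSON
    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    from advanced_alchemy.types import MutableList


    class Base(DeclarativeBase):
        pass


    class Playlist(Base):
        __tablename__ = "playlist"

        id: Mapped[int] = mapped_column(primary_key=True)
        # append()/remove() on a loaded instance emit change events, so
        # the ORM flushes the updated list; removed items are tracked in
        # _removed until commit or rollback.
        tracks: Mapped[list[str]] = mapped_column(
            MutableList.as_mutable(JSON()), default=list
        )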

__delitem__(index)[source]

Detect list del events and emit change events.

Return type:

None

__init__(*args, **kwargs)[source]

__setitem__(index, value)[source]

Detect list set events and emit change events.

Return type:

None

append(x)[source]

Append object to the end of the list.

Return type:

None

clear()[source]

Remove all items from list.

Return type:

None

classmethod coerce(key, value)[source]

Convert plain list to instance of this class.

Return type:

typing.Any

extend(x)[source]

Extend list by appending elements from the iterable.

Return type:

None

insert(i, x)[source]

Insert object before index.

Return type:

None

pop(*arg)[source]

Remove and return item at index (default last).

Raises IndexError if list is empty or index is out of range.

Return type:

TypeVar(T, bound=Any)

remove(i)[source]

Remove first occurrence of value.

Raises ValueError if the value is not present.

Return type:

None

reverse()[source]

Reverse IN PLACE.

Return type:

None

sort(**kw)[source]

Sort the list in ascending order and return None.

The sort is in-place (i.e. the list itself is modified) and stable (i.e. the order of two equal elements is maintained).

If a key function is given, apply it once to each list item and sort them, ascending or descending, according to their function values.

The reverse flag can be set to sort in descending order.

Return type:

None

class advanced_alchemy.types.PGCryptoBackend[source]

Bases: EncryptionBackend

PostgreSQL pgcrypto-based encryption backend.

This backend uses PostgreSQL’s pgcrypto extension for encryption/decryption operations. Requires the pgcrypto extension to be installed in the database.

passphrase

The base64-encoded passphrase used for encryption and decryption.

Type:

bytes

decrypt(value)[source]

Decrypts the given value using pgcrypto.

Parameters:

value (Any) – The value to decrypt.

Returns:

The decrypted value.

Return type:

str

encrypt(value)[source]

Encrypts the given value using pgcrypto.

Parameters:

value (Any) – The value to encrypt.

Returns:

The encrypted value.

Return type:

str

init_engine(key)[source]

Initializes the pgcrypto engine with the provided key.

Parameters:

key (bytes | str) – The encryption key.

class advanced_alchemy.types.StorageBackend[source]

Bases: ABC

Unified protocol for storage backend implementations supporting both sync and async operations.

__init__(key, fs, **kwargs)[source]

Initialize the storage backend.

Parameters:
  • key (str) – The key of the backend instance

  • fs (Any) – The filesystem or storage client

  • **kwargs (Any) – Additional keyword arguments

abstractmethod delete_object(paths)[source]

Delete one or more files.

Parameters:

paths (Union[str, Path, PathLike[Any], Sequence[Union[str, Path, PathLike[Any]]]]) – Path or paths to delete

Return type:

None

abstractmethod async delete_object_async(paths)[source]

Delete one or more files asynchronously.

Parameters:

paths (Union[str, Path, PathLike[Any], Sequence[Union[str, Path, PathLike[Any]]]]) – Path or paths to delete

Return type:

None

abstractmethod get_content(path, *, options=None)[source]

Get the content of a file.

Parameters:
  • path – Path to the file

  • options – Optional backend-specific options

Returns:

The file content

Return type:

bytes

abstractmethod async get_content_async(path, *, options=None)[source]

Get the content of a file asynchronously.

Parameters:
  • path – Path to the file

  • options – Optional backend-specific options

Returns:

The file content

Return type:

bytes

abstractmethod save_object(file_object, data, *, use_multipart=None, chunk_size=5242880, max_concurrency=12)[source]

Store a file using information from a FileObject.

Parameters:
  • file_object – A FileObject instance containing metadata like path, content_type.

  • data – The file data to store.

  • use_multipart – Whether to use multipart upload.

  • chunk_size – Size of chunks for multipart upload.

  • max_concurrency – Maximum number of concurrent uploads.

Returns:

The stored file object, potentially updated with backend info (size, etag, etc.).

Return type:

FileObject

abstractmethod async save_object_async(file_object, data, *, use_multipart=None, chunk_size=5242880, max_concurrency=12)[source]

Store a file asynchronously using information from a FileObject.

Parameters:
  • file_object – A FileObject instance containing metadata like path, content_type.

  • data – The file data to store.

  • use_multipart – Whether to use multipart upload.

  • chunk_size – Size of chunks for multipart upload.

  • max_concurrency – Maximum number of concurrent uploads.

Returns:

The stored file object, potentially updated with backend info (size, etag, etc.).

Return type:

FileObject

abstractmethod sign(paths, *, expires_in=None, for_upload=False)[source]

Generate a signed URL for one or more files.

Parameters:
  • paths – Path or paths to sign

  • expires_in – Optional expiration time in seconds

  • for_upload – Whether the URL is for upload

Returns:

The signed URL

Return type:

str

abstractmethod async sign_async(paths, *, expires_in=None, for_upload=False)[source]

Generate a signed URL for one or more files asynchronously.

Parameters:
  • paths – Path or paths to sign

  • expires_in – Optional expiration time in seconds

  • for_upload – Whether the URL is for upload

Returns:

The signed URL

Return type:

str

driver: str

The name of the storage backend.

protocol: str

The protocol used by the storage backend.

key: str

The key of the backend instance.

class advanced_alchemy.types.StorageRegistry[source]

Bases: object

A registry for creating and managing configured storage backends.

__init__(json_serializer=encode_json, json_deserializer=msgspec.json.Decoder().decode, default_backend='advanced_alchemy.types.file_object.backends.obstore.ObstoreBackend')[source]

Initialize the StorageRegistry.
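
A hedged sketch of the registry API using only the methods documented below; the "memory://" URI and the key name are assumptions (a string value is resolved through the default backend class and requires an explicit key):

    from advanced_alchemy.types import StorageRegistry

    registry = StorageRegistry()
    registry.register_backend("memory://", key="memory")  # string values need a key
    assert registry.is_registered("memory")
    backend = registry.get_backend("memory")  # raises ImproperConfigurationError if absent
    print(registry.registered_backends())     # e.g. ['memory']
    registry.clear_backends()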

clear_backends()[source]

Clear the registry.

Return type:

None

get_backend(key)[source]

Retrieve a configured storage backend from the registry.

Returns:

The storage backend associated with the given key.

Return type:

StorageBackend

Raises:

ImproperConfigurationError – If no storage backend is registered with the given key.

is_registered(key)[source]

Check if a storage backend is registered in the registry.

Parameters:

key (str) – The key of the storage backend

Returns:

True if the storage backend is registered, False otherwise.

Return type:

bool

register_backend(value, key=None)[source]

Register a new storage backend in the registry.

Parameters:
  • value – The storage backend to register.

  • key – The key to register the storage backend with.

Raises:

ImproperConfigurationError – If a string value is provided without a key.

registered_backends()[source]

Return a list of all registered keys.

Return type:

list[str]

set_default_backend(default_backend)[source]

Set the default storage backend.

Parameters:

default_backend – The default storage backend

unregister_backend(key)[source]

Unregister a storage backend from the registry.

Return type:

None

class advanced_alchemy.types.StoredObject[source]

Bases: TypeDecorator

Custom SQLAlchemy type for storing single or multiple file metadata.

Stores file metadata in JSONB and handles file validation, processing, and storage operations through a configured storage backend.
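
A hedged sketch of declaring a column with this type; the "memory" backend key is an assumption and must be registered beforehand:

    from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

    from advanced_alchemy.types import FileObject, StoredObject


    class Base(DeclarativeBase):
        pass


    class Attachment(Base):
        __tablename__ = "attachment"

        id: Mapped[int] = mapped_column(primary_key=True)
        # Only metadata lives in this JSON column; the bytes themselves
        # are stored by the backend resolved from the registry key.
        document: Mapped[FileObject] = mapped_column(StoredObject(backend="memory"))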

__init__(backend, multiple=False, *args, **kwargs)[source]

Initialize StoredObject type.

Parameters:
  • backend (Union[str, StorageBackend]) – Key to retrieve the backend from the storage registry, or a storage backend instance to use.

  • multiple (bool) – If True, stores a list of files; otherwise, a single file.

  • *args (typing.Any) – Additional positional arguments for TypeDecorator.

  • **kwargs (typing.Any) – Additional keyword arguments for TypeDecorator.

property backend: StorageBackend

Resolves and returns the storage backend instance.

cache_ok: Optional[bool] = True

Indicate if statements using this ExternalType are “safe to cache”. See the full discussion under GUID.cache_ok above; the behavior here is identical.

impl: Union[TypeEngine[Any], Type[TypeEngine[Any]]] = JSON()

process_bind_param(value, dialect)[source]

Convert FileObject(s) to JSON representation for the database.

Injects the configured backend into the FileObject before conversion.

Note: This method expects an already processed FileInfo or its dict representation. Use handle_upload() or handle_upload_async() for processing raw uploads.

Parameters:
  • value – The value to process

  • dialect – The SQLAlchemy dialect

Raises:

TypeError – If the input value is not a FileObject or a list of FileObjects.

Returns:

A dictionary representing the file metadata, or None if the input value is None.

process_result_value(value, dialect)[source]

Convert database JSON back to FileObject or MutableList[FileObject].

Parameters:
  • value – The value to process

  • dialect – The SQLAlchemy dialect

Raises:

TypeError – If the input value is not a list of dicts.

Returns:

FileObject or MutableList[FileObject] or None.

property python_type: type[FileObject | list[FileObject] | set[FileObject] | MutableList[FileObject] | None]

Specifies the Python type used, accounting for the multiple flag.

property storage_key: str

Returns the storage key from the resolved backend.

multiple: bool

If True, a list of files is stored; otherwise, a single file.