dal.plugins.persistence.redis package

Submodules

dal.plugins.persistence.redis.redis module

Copyright (C) Mov.ai - All Rights Reserved Unauthorized copying of this file, via any medium is strictly prohibited Proprietary and confidential

Developers: - Alexandre Pires (alexandre.pires@mov.ai) - 2020

class dal.plugins.persistence.redis.redis.RedisPlugin(**kwargs)

Bases: PersistencePlugin

This class implements the plugin that stores data on Redis, maintaining compatibility with V1. This means that when saving to Redis, the key used for each attribute is composed under the following constraints:

  • When an attribute has the “value_on_key” flag, its value is stored as part of

the key composition; otherwise it is stored as a value - Composition pattern: <scope>:<id>[(,<attr>:<id>)*](,<attr>)+:(<value>)*
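The composition pattern above can be made concrete with a small helper. This is a hypothetical sketch of how such a key might be assembled, not the plugin's actual implementation; the function name and parameters are illustrative assumptions:

```python
def compose_key(scope, scope_id, attr_path, leaf_attr, value=""):
    """Build a key matching <scope>:<id>[(,<attr>:<id>)*](,<attr>)+:(<value>)*

    attr_path: intermediate (attr, id) pairs; leaf_attr: the final attribute
    name (no id of its own); value: appended only when the leaf attribute has
    the "value_on_key" flag.
    """
    parts = [f"{scope}:{scope_id}"]
    # Intermediate attributes carry their own id: ",<attr>:<id>"
    parts += [f"{attr}:{attr_id}" for attr, attr_id in attr_path]
    # The final attribute appears without an id: ",<attr>"
    parts.append(leaf_attr)
    # The value, if stored "on key", follows the trailing colon
    return ",".join(parts) + ":" + value
```

For example, `compose_key("Node", "mynode", [("Parameter", "foo")], "Value")` yields `"Node:mynode,Parameter:foo,Value:"`.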

backup(**kwargs)

archive one or more scopes into a zip file

create_workspace(ref: str, **kwargs)

creates a new workspace; on the Redis driver this is not supported by design

decode_hash(_hash)

Decodes a full hash from redis

decode_list(_list)

Decodes a full list from redis

decode_value(_value)

Decodes a value from redis
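The three decode helpers above convert the raw bytes that redis-py returns into Python values. A minimal sketch of what such helpers might look like (the exact implementation is not shown in this reference, so treat these bodies as assumptions):

```python
def decode_value(_value):
    # redis-py returns bytes unless the client was created with
    # decode_responses=True; pass non-bytes values through untouched
    return _value.decode("utf-8") if isinstance(_value, bytes) else _value

def decode_hash(_hash):
    # A Redis hash arrives as a dict of bytes -> bytes; decode both sides
    return {decode_value(k): decode_value(v) for k, v in _hash.items()}

def decode_list(_list):
    # A Redis list arrives as a list of bytes; decode each element
    return [decode_value(v) for v in _list]
```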

delete(data: object | None = None, **kwargs)

delete an object from the persistent layer; you may provide a ScopeInstanceVersionNode or use the following args:

  • scope

  • ref

  • data

And, optionally:

  • schema_version

If schema_version is not specified we try to load it from Redis; if it is missing, the default is 1.0.

If data is not a dict, all keys defined in the schema are erased; otherwise only the passed structure is deleted.

delete_all_keys(schema: TreeNode, base: str, keys: list, conn: Redis)

Delete the object from Redis, according to the V1 specification

delete_keys(schema: TreeNode, base: str, keys: list, conn: Redis, data: dict)

Delete some keys from Redis, according to the V1 specification

delete_workspace(ref: str)

deletes an existing workspace

fetch_keys(conn, scope: str, ref: str) → list

Get keys using KEYS command

fetch_keys_iter(conn, scope: str, ref: str) → list

Get keys using SCAN ITER command

Get a list of all related objects
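The difference between the two fetchers: KEYS returns every matching key in one blocking call, while SCAN iterates incrementally and is gentler on a large database. A rough sketch of the matching both would perform, simulated here over an in-memory key list (the real methods issue `conn.keys(...)` / `conn.scan_iter(...)`; these bodies are illustrative assumptions):

```python
from fnmatch import fnmatch

def fetch_keys(keys, scope, ref):
    # Analogue of KEYS "<scope>:<ref>,*": one blocking pass, full result list
    pattern = f"{scope}:{ref},*"
    return [k for k in keys if fnmatch(k, pattern)]

def fetch_keys_iter(keys, scope, ref):
    # Analogue of SCAN ITER "<scope>:<ref>,*": yields matches incrementally
    # instead of materializing them all at once
    pattern = f"{scope}:{ref},*"
    for k in keys:
        if fnmatch(k, pattern):
            yield k

store = ["Node:a,Value:", "Node:b,Value:", "Flow:a,Label:x"]
```

On a production database SCAN ITER is usually preferred, since KEYS blocks the server for the duration of the scan.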

get_scope_info(**kwargs)

get the information of a scope

key_to_dict(schema: TreeNode, key: str, conn, data)

convert a key in the V1 specification to a dictionary; also loads the value from the Redis database

list_scopes(**kwargs)

list all existing scopes. This might need to change in the future: for now it searches all keys in Redis; we may later need a “caching” mechanism, probably a set that is updated every time a scope is added or deleted
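Given the key composition pattern, listing scopes amounts to extracting the leading `<scope>` segment from every key. A hypothetical sketch of that extraction (not the actual implementation):

```python
def scopes_from_keys(keys):
    # The scope is the segment before the first ":" in each V1 key
    return sorted({k.split(":", 1)[0] for k in keys})
```

This is the kind of full-keyspace walk the docstring warns about; a maintained set of scope names would avoid it.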

list_versions(**kwargs)

list all existing versions of a scope

list_workspaces()

list available workspaces; on the Redis driver this is not supported by design

load_keys(schema: TreeNode, base: str, keys: list, conn: Redis, out: dict)

Load the object from Redis, according to the V1 specification

property plugin_name

Get current plugin name

property plugin_version

Get current plugin version

read(**kwargs)

load an object from the persistent layer; you must provide the following args:

  • scope

  • ref

The following argument is optional:

  • schema_version

If schema_version is not specified we try to load it from Redis; if it is missing, the default is 1.0.

rebuild_indexes(**kwargs)

force the database layer to rebuild all indexes; this is a costly operation

restore(**kwargs)

restore one or more scopes from a zip file

save_keys(schema: TreeNode, base: str, keys: list, conn: Redis, data: dict)

Save the object in Redis, according to the V1 specification

schema_to_key(schema: TreeNode)

Convert a schema to a redis key according to the standard

abstract property versioning

returns whether this plugin supports versioning

workspace_info(ref: str)

get information about a workspace

write(data: object, **kwargs)

Stores the object on the persistent layer; for now only ScopeInstanceVersionNode and Python dict are supported.

If you pass a dict you must provide the following args:
  • scope

  • ref

  • schema_version

Currently a dict must be in one of the following forms:

  • { <scope> : { <ref> : { <data> } } }

  • { <data> }

The data part must comply with the schema of the scope
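The two accepted dict shapes can be normalized to the bare <data> form before writing. A hypothetical helper illustrating the distinction (the name and logic are assumptions for illustration, not part of the API):

```python
def normalize_payload(data, scope, ref):
    # Form 1: { <scope>: { <ref>: { <data> } } } — unwrap the envelope
    if set(data) == {scope} and set(data[scope]) == {ref}:
        return data[scope][ref]
    # Form 2: bare { <data> } — already in the right shape
    return data
```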

Module contents


class dal.plugins.persistence.redis.RedisPlugin(**kwargs)

Bases: PersistencePlugin

This class implements the plugin that stores data on Redis, maintaining compatibility with V1. This means that when saving to Redis, the key used for each attribute is composed under the following constraints:

  • When an attribute has the “value_on_key” flag, its value is stored as part of

the key composition; otherwise it is stored as a value - Composition pattern: <scope>:<id>[(,<attr>:<id>)*](,<attr>)+:(<value>)*

backup(**kwargs)

archive one or more scopes into a zip file

create_workspace(ref: str, **kwargs)

creates a new workspace; on the Redis driver this is not supported by design

decode_hash(_hash)

Decodes a full hash from redis

decode_list(_list)

Decodes a full list from redis

decode_value(_value)

Decodes a value from redis

delete(data: object | None = None, **kwargs)

delete an object from the persistent layer; you may provide a ScopeInstanceVersionNode or use the following args:

  • scope

  • ref

  • data

And, optionally:

  • schema_version

If schema_version is not specified we try to load it from Redis; if it is missing, the default is 1.0.

If data is not a dict, all keys defined in the schema are erased; otherwise only the passed structure is deleted.

delete_all_keys(schema: TreeNode, base: str, keys: list, conn: Redis)

Delete the object from Redis, according to the V1 specification

delete_keys(schema: TreeNode, base: str, keys: list, conn: Redis, data: dict)

Delete some keys from Redis, according to the V1 specification

delete_workspace(ref: str)

deletes an existing workspace

fetch_keys(conn, scope: str, ref: str) → list

Get keys using KEYS command

fetch_keys_iter(conn, scope: str, ref: str) → list

Get keys using SCAN ITER command

Get a list of all related objects

get_scope_info(**kwargs)

get the information of a scope

key_to_dict(schema: TreeNode, key: str, conn, data)

convert a key in the V1 specification to a dictionary; also loads the value from the Redis database

list_scopes(**kwargs)

list all existing scopes. This might need to change in the future: for now it searches all keys in Redis; we may later need a “caching” mechanism, probably a set that is updated every time a scope is added or deleted

list_versions(**kwargs)

list all existing versions of a scope

list_workspaces()

list available workspaces; on the Redis driver this is not supported by design

load_keys(schema: TreeNode, base: str, keys: list, conn: Redis, out: dict)

Load the object from Redis, according to the V1 specification

property plugin_name

Get current plugin name

property plugin_version

Get current plugin version

read(**kwargs)

load an object from the persistent layer; you must provide the following args:

  • scope

  • ref

The following argument is optional:

  • schema_version

If schema_version is not specified we try to load it from Redis; if it is missing, the default is 1.0.

rebuild_indexes(**kwargs)

force the database layer to rebuild all indexes; this is a costly operation

restore(**kwargs)

restore one or more scopes from a zip file

save_keys(schema: TreeNode, base: str, keys: list, conn: Redis, data: dict)

Save the object in Redis, according to the V1 specification

schema_to_key(schema: TreeNode)

Convert a schema to a redis key according to the standard

abstract property versioning

returns whether this plugin supports versioning

workspace_info(ref: str)

get information about a workspace

write(data: object, **kwargs)

Stores the object on the persistent layer; for now only ScopeInstanceVersionNode and Python dict are supported.

If you pass a dict you must provide the following args:
  • scope

  • ref

  • schema_version

Currently a dict must be in one of the following forms:

  • { <scope> : { <ref> : { <data> } } }

  • { <data> }

The data part must comply with the schema of the scope