The ObjBackup module provides a robust framework for creating and managing database backups. It is designed to handle backups for MariaDB/MySQL databases, allowing for flexible configuration of what to back up, where to store it, and how to manage the process. It supports backing up multiple databases from different remote connections, filtering tables, and managing the backup lifecycle.
The module consists of two main classes: ObjBackup for handling a single backup operation and BackupSet for managing and scheduling a collection of backup tasks.
The module defines several constants for consistent behavior:
- PROGRESS_UPDATE_INTERVAL: Update interval for progress bars (2 seconds)
- MIN_VALID_BACKUP_SIZE: Minimum size for a valid backup file (4096 bytes)
- DEFAULT_BACKUP_TIME: Default time for scheduled backups ("20:00")
- STATUS_DONE: Success status constant ("DONE")
- STATUS_ERROR: Error status constant ("ERROR")
- SYSTEM_DATABASES: List of system databases to exclude from backups

The ObjBackup class encapsulates the logic for a single database backup operation. It inherits from ObjApi.ObjApi and handles everything from reading the backup definition to creating the final compressed archive.
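A minimal sketch of how these module-level constants might be declared, based on the descriptions above (the exact contents of SYSTEM_DATABASES are an assumption, shown here as the usual MariaDB/MySQL system schemas):

```python
# Module-level constants (sketch; values taken from the documentation above,
# SYSTEM_DATABASES contents are an assumption).
PROGRESS_UPDATE_INTERVAL = 2    # seconds between progress-bar updates
MIN_VALID_BACKUP_SIZE = 4096    # bytes; smaller archives are treated as invalid
DEFAULT_BACKUP_TIME = "20:00"   # default scheduled backup time (HH:MM)
STATUS_DONE = "DONE"            # written back on success
STATUS_ERROR = "ERROR"          # written back on failure
SYSTEM_DATABASES = ["information_schema", "mysql", "performance_schema", "sys"]
```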
__init__(self, DB=0)
Initializes a new ObjBackup instance, setting connection parameters, the backup configuration, and param1-param9 via loop-based assignment. The activated attribute is a boolean indicating whether a factory backup module has been loaded.
read_backup(self, guid: str = "", backupcode: str = "", database: str = "", param1-9: str = "")
Reads the backup definition from the def_Backup table based on a backupcode and package. It sets up essential parameters such as the backup folder and table masks. Parameters are assigned in a loop to reduce code duplication.
- guid: The unique identifier for the staged backup task
- backupcode: The name of the backup definition to use
- database: The target database to back up
- param1-param9: Nine configurable parameters for custom backup logic

pack(self, context: dict, status='') -> Tuple[str, int]
This is the core method that performs the backup. It connects to the remote database, determines which tables to back up based on the _Tablemask, and uses mysqldump to create a single SQL dump file. The file is then compressed into a .7z archive.
- context: A dictionary used to pass state and results between components
- Returns (status, size), where status is STATUS_DONE or STATUS_ERROR and size is the size of the final archive in bytes

Key features of the pack method:
- Uses fnmatch to selectively include tables based on the _Tablemask (e.g., def_*, data_*)
- Uses 7z to compress the SQL dump, saving storage space
- Updates the stage_backup and def_Backup tables using the status constants

archive(self, database_name: str)
Dumps a specific database schema to a compressed file using primary credentials from config.yaml. Provides interactive prompts for dropping the database after a successful backup. Archives are moved to the archive.documents/backup/ directory if the database is dropped.
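The table-mask filtering that pack performs can be sketched with fnmatch. This is an illustrative reading of the behavior, assuming the mask is a comma-separated list of wildcard patterns; the table names and the helper name filter_tables are made up for the example:

```python
from fnmatch import fnmatch

def filter_tables(tables, tablemask):
    """Keep only the tables whose name matches one of the comma-separated
    wildcard patterns in tablemask (e.g. "def_*,data_*")."""
    patterns = [p.strip() for p in tablemask.split(",") if p.strip()]
    return [t for t in tables if any(fnmatch(t, p) for p in patterns)]

tables = ["def_Backup", "data_orders", "tmp_scratch", "def_Users"]
filter_tables(tables, "def_*,data_*")
# → ["def_Backup", "data_orders", "def_Users"]
```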
_init_default_attrs()
Initializes default values for _Tablemask (wildcard *) and _Backupfolder if not set.
_normalize_tablemask()
Normalizes the table mask to the wildcard * if it is empty or set to a generic value (ALL, *, %).
_safe_get_filesize(file_path: str) -> int
Safely retrieves a file's size, returning 0 if the file doesn't exist.
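These two small helpers could be sketched as follows; this is a non-authoritative reading of the descriptions above, not the module's actual implementation:

```python
import os

def normalize_tablemask(tablemask: str) -> str:
    """Fall back to the wildcard "*" for empty or generic masks (ALL, *, %)."""
    if not tablemask or tablemask.strip().upper() in ("ALL", "*", "%"):
        return "*"
    return tablemask

def safe_get_filesize(file_path: str) -> int:
    """Return the file size in bytes, or 0 if the file does not exist."""
    try:
        return os.path.getsize(file_path)
    except OSError:
        return 0
```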
_get_table_size(table: str) -> int
Retrieves the size of a specific table in bytes from the database.
_get_charset_from_collation(collation: str) -> str
Extracts the charset from a collation string (e.g., utf8mb4_unicode_ci → utf8mb4).
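MySQL/MariaDB collation names are prefixed with their charset, so the extraction can be sketched as a simple string split (a minimal sketch, not necessarily the module's exact logic):

```python
def get_charset_from_collation(collation: str) -> str:
    """Extract the charset prefix from a collation name,
    e.g. "utf8mb4_unicode_ci" -> "utf8mb4"."""
    return collation.split("_", 1)[0] if collation else ""

get_charset_from_collation("utf8mb4_unicode_ci")  # → "utf8mb4"
```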
_run_mysqldump_with_progress(cmd_args, output_file, total_bytes, database_name) -> bool
Executes mysqldump with real-time progress tracking. Returns True on success, False on failure. Handles subprocess management and error cleanup.
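A sketch of how the cmd_args list passed to this helper might be assembled. The flags shown (--host, --user, --password, --single-transaction) are standard mysqldump options, but the exact arguments the module uses are not documented here, and build_mysqldump_cmd is a hypothetical helper name:

```python
def build_mysqldump_cmd(host, user, password, database, tables):
    """Assemble a mysqldump argument list for the selected tables.
    --single-transaction gives a consistent snapshot for InnoDB tables."""
    cmd = [
        "mysqldump",
        f"--host={host}",
        f"--user={user}",
        f"--password={password}",
        "--single-transaction",
        database,
    ]
    cmd.extend(tables)  # dump only the tables that passed the mask filter
    return cmd

cmd = build_mysqldump_cmd("db.example.com", "backup", "secret",
                          "customer_db", ["def_Backup", "data_orders"])
```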
_get_connection_details(package: str)
Retrieves and decrypts remote database connection details from either config.yaml (PRIMARY) or the database.
_setup_backup_folders() -> Tuple[str, str, str]
Creates the necessary folder structure for backups and returns the temp, build, and final folder paths. Uses os.makedirs(exist_ok=True) to create directories idempotently.
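The folder setup can be sketched like this; the base path and the subfolder names temp/build/final are assumptions for illustration, not the module's actual layout:

```python
import os
import tempfile

def setup_backup_folders(base: str):
    """Create temp/build/final subfolders under base and return their paths.
    exist_ok=True makes repeated calls harmless."""
    paths = tuple(os.path.join(base, name) for name in ("temp", "build", "final"))
    for p in paths:
        os.makedirs(p, exist_ok=True)
    return paths

base = tempfile.mkdtemp()
temp_dir, build_dir, final_dir = setup_backup_folders(base)
setup_backup_folders(base)  # idempotent: no error on a second call
```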
factory_backup(self, factory_module: str) -> Optional[Any]
A factory method to load and instantiate custom backup processing logic from an external module. This allows the backup process to be extended with custom pre- or post-processing steps. Returns None if no matching module is found, otherwise returns the factory object instance.
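A factory lookup of this kind can be sketched with importlib. The entry-point class name "Factory" is a placeholder assumption, not the module's documented convention:

```python
import importlib
from typing import Any, Optional

def factory_backup(factory_module: str) -> Optional[Any]:
    """Try to import factory_module and instantiate its entry-point class;
    return None when no matching module (or class) exists."""
    try:
        mod = importlib.import_module(factory_module)
    except ModuleNotFoundError:
        return None
    factory_cls = getattr(mod, "Factory", None)  # assumed entry-point name
    return factory_cls() if factory_cls else None

factory_backup("no_such_backup_module_xyz")  # → None
```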
This class acts as a supervisor for managing multiple backup tasks. It inherits from ObjSupervisor.Supervisor and is responsible for staging and running backup jobs.
__init__(self, DB=0)
Initializes a new BackupSet instance.
pre_stage(self, BackupName="%")
Scans the def_Backup table for backup definitions and stages them for execution. It can discover databases on a remote connection and automatically create specific backup tasks for each one.
stage(self, remoteconnection: str, host: str, backup_name: str, database: str)
Creates a new record in the stage_backup table, effectively scheduling a backup job to be run. It avoids creating duplicate jobs if a recent one already exists.
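The duplicate-avoidance behavior can be illustrated with an in-memory SQLite table standing in for stage_backup. The real schema is not documented here, so the column names and the PENDING status value are assumptions (the real check also considers how recent the existing job is):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE stage_backup (
    backup_name TEXT, database_name TEXT, status TEXT)""")

def stage(backup_name: str, database: str) -> bool:
    """Insert a pending job unless an identical pending job already exists."""
    dup = conn.execute(
        "SELECT 1 FROM stage_backup WHERE backup_name=? AND database_name=? "
        "AND status='PENDING'", (backup_name, database)).fetchone()
    if dup:
        return False  # a matching job is already staged; skip
    conn.execute("INSERT INTO stage_backup VALUES (?, ?, 'PENDING')",
                 (backup_name, database))
    return True

stage("daily", "customer_db")   # → True  (newly staged)
stage("daily", "customer_db")   # → False (duplicate skipped)
```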
read(self, context, backup_name: str = "%")
Reads the list of pending backup jobs from the stage_backup table and initiates the backup process for each one by creating and running an ObjBackup instance.
run_workflow_direct(self, context, ...)
The main entry point for running the backup workflow. It calls pre_stage to schedule the jobs and then read to execute them.
Example Workflow:
1. run_workflow_direct is called.
2. pre_stage looks at the def_Backup configurations.
3. stage creates entries in the stage_backup table for each database to be backed up.
4. read queries stage_backup for pending jobs.
5. For each pending job, read creates an ObjBackup instance and calls its pack method.
6. The pack method performs the actual backup and compression.

The module includes a typer-based CLI for easy execution of backup tasks.
- prestage: Scans for backup definitions and stages them
- backupall: Runs the entire backup workflow for all defined backups
- backup [backupname]: Runs the workflow for a specific backup definition
- select: Interactive selection of backup compute resources
- archive [database_name]: Interactively or directly archive a database schema

Example CLI Usage:
# Stage all backup jobs
python ObjBackup.py prestage
# Run all staged backup jobs
python ObjBackup.py backupall
# Run a specific backup job named 'daily_customer_db'
python ObjBackup.py backup daily_customer_db
# Archive a specific database
python ObjBackup.py archive customer_db
# Interactive archive selection
python ObjBackup.py archive
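The command layout above can be sketched in code. The module itself uses typer; the sketch below deliberately uses the standard-library argparse instead so it stays dependency-free, and build_cli is a hypothetical helper name:

```python
import argparse

def build_cli() -> argparse.ArgumentParser:
    """Mirror the documented subcommands: prestage, backupall, backup,
    select, archive (with archive's database argument optional to allow
    interactive selection)."""
    parser = argparse.ArgumentParser(prog="ObjBackup.py")
    sub = parser.add_subparsers(dest="command", required=True)
    sub.add_parser("prestage")
    sub.add_parser("backupall")
    backup = sub.add_parser("backup")
    backup.add_argument("backupname")
    sub.add_parser("select")
    archive = sub.add_parser("archive")
    archive.add_argument("database_name", nargs="?")  # omitted → interactive
    return parser

args = build_cli().parse_args(["backup", "daily_customer_db"])
# args.command == "backup", args.backupname == "daily_customer_db"
```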
The module has been optimized for performance and maintainability:
- setattr() for param1-param9 assignment
- any()
- Optional[Any] for factory methods
- exist_ok=True to eliminate redundant existence checks

The module can be compiled with Cython:

cythonize -3 -a -i ObjBackup.py
Compiling /home/axion/projects/axion/factory.core/ObjBackup.py because it changed.
[1/1] Cythonizing /home/axion/projects/axion/factory.core/ObjBackup.py
/home/axion/projects/axion/dev-env/lib/python3.11/site-packages/setuptools/config/pyprojecttoml.py:108: _BetaConfiguration: Support for `[tool.setuptools]` in `pyproject.toml` is still *beta*.
  warnings.warn(msg, _BetaConfiguration)
error: could not create './factory/core/ObjBackup.cpython-311-x86_64-linux-gnu.so': No such file or directory
Updated: 2025-12-12