NOTICE: All information contained herein is, and remains, the property of TechnoCore. The intellectual and technical concepts contained herein are proprietary to TechnoCore. Dissemination of this information or reproduction of this material is strictly forbidden unless prior written permission is obtained from TechnoCore.
¶ Axion Architecture and Governance for package-client
The Axion system is a modular, service-oriented architecture designed for robust data processing, workflow automation, and secure integration for package-client. Built on a Python backend, it leverages a polyglot persistence model (MariaDB and MongoDB) and uses RabbitMQ for resilient, asynchronous communication. The architecture is designed from the ground up to be compliant with a comprehensive suite of governance policies, including Risk Management, Data Provenance and Lineage, and POPIA/GDPR.
Security is paramount, with controls such as a Web Application Firewall (WAF), asymmetric (public-key) encryption, and client-specific keys managed in AWS Secrets Manager. The system's design directly supports our Ransomware Protection Strategy through immutable backups and our DDoS Protection Strategy via layered network defenses. Deployed on AWS EC2, the architecture provides a scalable, secure, and auditable platform for package-client's business processes.
The design of the Axion system is guided by the following principles:
- Service-Oriented Architecture (SOA): Promotes modularity, making the system easier to develop, deploy, and maintain.
- Asynchronous Communication: Decouples services using RabbitMQ, improving fault tolerance, scalability, and responsiveness.
- Polyglot Persistence: Uses MariaDB and MongoDB to best fit the needs of structured and unstructured data for package-client.
- Centralized Configuration: Manages configuration centrally to ensure consistency across different environments.
- Security and Compliance by Design: Integrates security and data governance into every layer of the architecture, ensuring that the system inherently operates in alignment with our formal policies.
¶ 3. Security and Compliance by Design
This section details how the architecture implements the core tenets of our governance policies.
¶ 3.1. Data Governance and Lineage
In alignment with the Data Provenance and Lineage Policy, the architecture ensures full traceability of data belonging to package-client.
- Correlation IDs: Every request is tagged with a unique ID that is passed between services, enabling an end-to-end audit trail.
- MQTT for Telemetry: All significant events—data ingestion, transformation, access, and errors—are published as messages to dedicated MQTT topics.
- Structured Logging: Services generate structured logs that are forwarded to InfluxDB, providing a searchable, immutable record of all activities. This allows for the generation of lineage reports for auditing and compliance purposes.
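The correlation-ID mechanism above can be sketched as a small Python helper. The header name `X-Correlation-ID` is an illustrative assumption, not necessarily Axion's actual field name:

```python
import uuid

# Hypothetical header name; Axion's actual field name may differ.
CORRELATION_HEADER = "X-Correlation-ID"

def ensure_correlation_id(headers: dict) -> dict:
    """Return headers carrying a correlation ID, generating one if absent.

    Entry-point services mint the ID; downstream services reuse it,
    so a single request can be traced end-to-end across services.
    """
    headers = dict(headers)  # avoid mutating the caller's dict
    headers.setdefault(CORRELATION_HEADER, str(uuid.uuid4()))
    return headers

# The ID minted at the edge survives unchanged through downstream hops.
incoming = ensure_correlation_id({})
downstream = ensure_correlation_id(incoming)
```

Logging this ID with every structured log line is what makes the InfluxDB records joinable into a single audit trail.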
¶ 3.2. Resilience and Ransomware Protection
The architecture provides robust protection against data loss, aligning with our Backup and Recovery Policy and Ransomware Protection Strategy.
- Automated Cloud Backups: The use of AWS RDS for MariaDB provides automated rolling snapshots, which are protected by IAM policies to prevent unauthorized deletion.
- Immutable Offsite Backups: Database dumps and critical files are backed up to S3 buckets with Object Lock enabled. This creates immutable, versioned copies of data that are resilient to ransomware.
- On-Premise Resilience: For development and testing, Proxmox snapshots provide a rapid method for system recovery.
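An immutable upload to an Object Lock bucket can be sketched as follows. The bucket and key names are hypothetical, and the 30-day retention default is illustrative; the actual `put_object` call is shown commented because it requires AWS credentials and a bucket created with Object Lock enabled:

```python
from datetime import datetime, timedelta, timezone

def build_locked_put_args(bucket: str, key: str, body: bytes,
                          retain_days: int = 30) -> dict:
    """Build boto3 put_object kwargs for an immutable backup upload.

    COMPLIANCE mode prevents deletion or overwrite by any principal,
    including the account root, until the retention date passes,
    which is what makes the copy resilient to ransomware.
    """
    retain_until = datetime.now(timezone.utc) + timedelta(days=retain_days)
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        "ObjectLockMode": "COMPLIANCE",
        "ObjectLockRetainUntilDate": retain_until,
    }

# With boto3 (hypothetical bucket/key names):
# import boto3
# s3 = boto3.client("s3")
# s3.put_object(**build_locked_put_args(
#     "axion-backups", "dumps/mariadb.sql.gz", dump_bytes))
```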
¶ 3.3. DDoS Mitigation and Network Security
To ensure service availability for package-client, the architecture implements the DDoS Protection Strategy at multiple layers.
- Web Application Firewall (WAF): A ModSecurity WAF with the OWASP Core Rule Set is deployed at the edge to filter malicious traffic, including SQL injection and cross-site scripting attempts.
- IP Whitelisting and Rate Limiting: The Apache reverse proxy and UFW are configured to enforce strict IP whitelisting and connection rate limits, mitigating volumetric attacks.
- Payload Validation: The ServeWebHook.py service performs rigorous validation on all incoming data payloads, rejecting malformed or oversized requests before they enter the system.
¶ 3.4. Risk Management and Monitoring
The architecture provides the necessary visibility to support our Risk Management Policy.
- Centralized Monitoring: System metrics, logs, and security events are collected in InfluxDB and visualized in dashboards.
- Proactive Alerting: Threshold-based alerts are configured to notify the operations team of anomalies, such as unusual traffic patterns or high error rates, enabling proactive risk mitigation.
¶ 4. System Components and Workflow
ServeWebsite.py: The primary web interface for package-client's users, protected by the WAF.
ServeWebHook.py: The API entry point for external systems. It is a key security checkpoint, responsible for request validation and rate limiting.
ServeMqtt.py: Manages real-time communication and publishes telemetry data for monitoring and lineage tracking.
ServeWorkflow.py: The core business logic engine, which consumes messages from RabbitMQ and orchestrates workflows.
ServeReport.py: Generates reports for package-client, with access controls to ensure data segregation.
ServeImport.py: Handles bulk data imports, with validation and error logging.
factory.core:
ObjWorkflow.py (Workflow Engine - DAGs): At the heart of the system is a powerful workflow engine that models business processes as Directed Acyclic Graphs (DAGs). Each workflow is a series of nodes (steps) connected by directed edges (dependencies), ensuring a clear, acyclic flow from start to finish. The engine interprets these definitions from the database to execute tasks like calculations, service calls, and notifications in the correct order.
ObjConversation.py (Conversation Engine): This provides a flexible, data-driven engine for creating interactive, stateful conversations across multiple channels (e.g., Console, WhatsApp). It uses a modular design where each channel is a separate, pluggable interface, allowing for easy extension to new platforms.
ObjData.py: The Data Access Object (DAO) that provides a secure abstraction layer over the databases, preventing direct, untrusted queries.
ObjApi.py: The base class for business logic objects, providing a common interface.
ConfigIni.py: Provides standardized access to configuration from resource.config/.
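The DAG-ordering idea at the core of ObjWorkflow.py can be sketched with the standard-library `graphlib` module (Python 3.9+). The workflow nodes here are hypothetical, not an actual Axion definition:

```python
from graphlib import TopologicalSorter

# Hypothetical workflow: each node maps to the nodes it depends on.
workflow = {
    "validate": set(),
    "calculate": {"validate"},
    "persist": {"calculate"},
    "notify": {"calculate"},
}

def execution_order(dag: dict) -> list:
    """Return a valid execution order for the DAG.

    TopologicalSorter guarantees every dependency appears before its
    dependents and raises CycleError if the graph is not acyclic,
    which enforces the clear start-to-finish flow described above.
    """
    return list(TopologicalSorter(dag).static_order())
```

The real engine loads its node and edge definitions from the database rather than a literal dict, but the ordering guarantee is the same.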
- MariaDB (on AWS RDS): The primary relational database for structured data. Its deployment on RDS ensures high availability and automated backups.
- MongoDB: The NoSQL data store for unstructured data, such as JSON payloads and logs.
- RabbitMQ: The asynchronous backbone of the system, decoupling services and providing resilience against service failures.
- Production: Deployed on AWS EC2 in the af-south-1 (Cape Town) region to comply with the data residency and sovereignty requirements of package-client.
- Development & CI/CD: A local Proxmox environment is used for development and testing.
graph TD
subgraph "User & External Systems"
U[/"User via Browser"/]
ES[/"External Systems (API, Webhooks)"/]
end
subgraph "Network Edge"
WAF[("ModSecurity WAF")]
end
subgraph "Axion Services (Entry Points)"
direction LR
SW[ServeWebsite.py]
SWH[ServeWebHook.py]
SMQ[ServeMqtt.py]
SWF[ServeWorkflow.py]
end
subgraph "Core Logic (factory.core)"
FC[ObjApi / ObjData]
end
subgraph "Messaging & Telemetry"
RMQ[("RabbitMQ")]
MQTT[("MQTT Broker")]
end
subgraph "Data Persistence & Backups"
direction LR
MARIA[("MariaDB on AWS RDS")]
MONGO[("MongoDB")]
S3[("S3 with Object Lock")]
end
subgraph "Monitoring"
INFLUX[("InfluxDB")]
end
%% Connections
U --> WAF --> SW
ES --> WAF --> SWH
SW --> FC
SWH --> FC
SMQ --> FC
SWF --> FC
FC --> MARIA
FC --> MONGO
MARIA --> S3
MONGO --> S3
SWH --> RMQ
SWF --> RMQ
SMQ -- Telemetry --> MQTT
FC -- Events --> MQTT
MQTT -- Logs --> INFLUX