This guide explains how to configure a new file import process by adding an entry to the def_dataimport database table.
Every import process is defined by a single record in the def_dataimport table. The ObjDataImport service reads these records to understand where to look for files, how to parse them, and what to do with the data.
To create a new import, you need to insert a new row into def_dataimport with the following fields configured:
| Field | Purpose | Example |
|---|---|---|
| DataimportCode | A unique name/code for your import process. | DAILY_SALES_CSV |
| Directory | The absolute path to the directory to watch for new files. You can specify multiple directories separated by commas. | /home/axion/axion/local.documents/imports/sales |
| Filename | A file mask to identify which files to process. Uses standard glob patterns. | sales_report_*.csv |
| Package | The application package this import belongs to. | CORE |
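As a rough illustration of how the Directory and Filename fields combine, the sketch below splits a comma-separated directory list and applies the glob mask with Python's standard `fnmatch`. The helper name and the case-sensitive matching are assumptions for illustration, not the actual ObjDataImport implementation:

```python
import fnmatch

def candidate_matches(directory_field, filename_field, directory, name):
    """Hypothetical helper: does a file named `name` in `directory`
    match this def_dataimport record? Assumes comma-separated
    directories and case-sensitive glob matching, per the table above."""
    directories = [d.strip() for d in directory_field.split(",")]
    return directory in directories and fnmatch.fnmatchcase(name, filename_field)

print(candidate_matches(
    "/home/axion/imports/sales, /home/axion/imports/backfill",
    "sales_report_*.csv",
    "/home/axion/imports/sales",
    "sales_report_20250101.csv",
))  # True
```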
| Field | Purpose | Example |
|---|---|---|
| ImportType | Specifies which parser to use. If left empty, it's inferred from the file extension (e.g., csv, xlsx). | csv |
| Delimiter | (CSV only) The character used to separate fields. If empty, the parser will attempt to auto-detect it. | ; |
| Quotechar | (CSV only) The character used to enclose fields that may contain the delimiter. | " |
| Encoding | The character encoding of the source file. | utf-8 |
| Sheetname | (Excel only) The name of the sheet to import data from. | January Sales |
| Sheetnumber | (Excel only) The 1-based index of the sheet to import. | 1 |
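Delimiter auto-detection of the kind described above can be approximated with Python's standard `csv.Sniffer`. This is a sketch of the general technique, not necessarily the parser the service actually uses:

```python
import csv
import io

sample = "id;name;amount\n1;widget;100\n2;gadget;250\n"

# Sniff the delimiter from a sample of the file's content.
dialect = csv.Sniffer().sniff(sample)
print(dialect.delimiter)  # ;

# Parse using the detected delimiter and an explicit quote character.
rows = list(csv.reader(io.StringIO(sample), delimiter=dialect.delimiter, quotechar='"'))
print(rows[0])  # ['id', 'name', 'amount']
```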
| Field | Purpose | Example |
|---|---|---|
| TableName | The name of the final destination table where the data will be loaded. | fact_daily_sales |
| Overwrite | Set to Y to drop and recreate the destination table with each import. Set to N to append data. | N |
| ProcessedDir | The directory where files are moved after being successfully imported. If left empty, defaults to a done subfolder in the source directory. | /home/axion/axion/archive.documents/sales |
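The ProcessedDir fallback can be sketched as follows; `resolve_processed_dir` is a hypothetical helper written from the table above, and the real service's behavior may differ in detail:

```python
import os

def resolve_processed_dir(source_dir: str, processed_dir: str) -> str:
    """Hypothetical helper: an empty ProcessedDir falls back to a
    'done' subfolder inside the source directory, per the table above."""
    return processed_dir if processed_dir else os.path.join(source_dir, "done")

print(resolve_processed_dir("/home/axion/incoming/sales", ""))
# /home/axion/incoming/sales/done
print(resolve_processed_dir("/home/axion/incoming/sales", "/home/axion/archive/sales"))
# /home/axion/archive/sales
```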
| Field | Purpose | Example |
|---|---|---|
| PreSql | A SQL script to execute before the import process begins. | TRUNCATE TABLE staging_sales; |
| PostSql | A SQL script to execute after the data has been successfully loaded into the final table. | CALL sp_process_daily_sales(); |
| WorkflowName | The name of a workflow (from def_workflow) to trigger after a successful import. | SALES_DATA_VALIDATION |
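Taken together, these fields imply an execution order of PreSql, then the load, then PostSql, then the workflow trigger. The sketch below models that sequence with placeholder callables; all names here are illustrative, and only the ordering reflects the documentation:

```python
def run_import(execute_sql, load_data, trigger_workflow, record):
    """Illustrative sequencing of a def_dataimport record's hooks.
    The three callables stand in for the service's internals."""
    if record.get("PreSql"):
        execute_sql(record["PreSql"])             # before the import begins
    load_data(record["TableName"])                # load into the final table
    if record.get("PostSql"):
        execute_sql(record["PostSql"])            # after a successful load
    if record.get("WorkflowName"):
        trigger_workflow(record["WorkflowName"])  # e.g. SALES_DATA_VALIDATION

events = []
run_import(
    execute_sql=lambda sql: events.append(("sql", sql)),
    load_data=lambda table: events.append(("load", table)),
    trigger_workflow=lambda name: events.append(("workflow", name)),
    record={
        "PreSql": "TRUNCATE TABLE staging_sales;",
        "TableName": "fact_daily_sales",
        "PostSql": "CALL sp_process_daily_sales();",
        "WorkflowName": "SALES_DATA_VALIDATION",
    },
)
print([e[0] for e in events])  # ['sql', 'load', 'sql', 'workflow']
```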
Let's configure an import for daily sales reports. The files are CSVs named sales_YYYYMMDD.csv; they arrive in /home/axion/data/incoming/sales, and the data needs to be loaded into the sales_transactions table.
Step 1: Insert a new record into def_dataimport with the following values:
| Field | Value |
|---|---|
| DataimportCode | SALES_CSV_IMPORT |
| Directory | /home/axion/data/incoming/sales |
| Filename | sales_*.csv |
| Package | CORE |
| ImportType | csv |
| Delimiter | , |
| TableName | sales_transactions |
| Overwrite | N |
| ProcessedDir | /home/axion/data/processed/sales |
| PostSql | UPDATE sales_transactions SET processed_flag = 1 WHERE processed_flag = 0; |

Step 2: Start the Import Service
Ensure the watch service is running:

```shell
python ServeImport.py watch
```
Step 3: Test the Import
Place a file named sales_20251019.csv into the /home/axion/data/incoming/sales directory.
Verification:
- The watch service log should show that it detected and processed the file.
- The file should be moved from the incoming directory to the processed directory.
- The data should appear in the sales_transactions table.
- The PostSql script should have executed, updating the processed_flag column.
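To make the PostSql effect concrete, here is a self-contained simulation using SQLite. The real sales_transactions table lives in your application database; this only sketches the flag update from Step 1:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_transactions (id INTEGER, processed_flag INTEGER)")
conn.executemany("INSERT INTO sales_transactions VALUES (?, 0)", [(1,), (2,), (3,)])

# The PostSql from Step 1: mark freshly imported rows as processed.
conn.execute("UPDATE sales_transactions SET processed_flag = 1 WHERE processed_flag = 0")

unprocessed = conn.execute(
    "SELECT COUNT(*) FROM sales_transactions WHERE processed_flag = 0"
).fetchone()[0]
print(unprocessed)  # 0
```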