Detailed documentation for all scripts in the Fluxspace Core pipeline
Follow these steps to process data from collection to visualization
python3 scripts/mag_to_csv.py
# Output: data/raw/mag_data.csv

python3 scripts/validate_and_diagnosticsV1.py --in data/raw/mag_data.csv --drop-outliers
# Output: data/processed/mag_data_clean.csv + diagnostics

python3 scripts/compute_local_anomaly_v2.py --in data/processed/mag_data_clean.csv --radius 0.30 --plot
# Output: data/processed/mag_data_anomaly.csv

python3 scripts/interpolate_to_heatmapV1.py --in data/processed/mag_data_anomaly.csv --value-col local_anomaly --grid-step 0.05
# Output: data/exports/mag_data_grid.csv + mag_data_heatmap.png

Collect magnetic field measurements from an MMC5983MA magnetometer sensor and save them to CSV
Connects to MMC5983MA sensor via I2C and operates in auto-grid mode, automatically generating a grid of measurement points. At each grid point, prompts user to move sensor and press Enter. Takes multiple samples per point and averages them for accuracy.
data/raw/mag_data.csv (or custom path)
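The per-point "take multiple samples and average them" step can be sketched as below. The `read_b_total` stub and the parameter names are illustrative, not the script's actual API; the real script reads the MMC5983MA over I2C.

```python
import statistics

def read_b_total():
    # Stand-in for an MMC5983MA I2C read returning field magnitude.
    # (Hypothetical stub -- the real script talks to the sensor over I2C.)
    return 48.2

def sample_point(read_fn, n_samples=10):
    """Average n_samples readings at one grid point for accuracy."""
    readings = [read_fn() for _ in range(n_samples)]
    return statistics.mean(readings)

# At each grid point the script prompts the user, then records the average:
b_avg = sample_point(read_b_total)
```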
python3 scripts/mag_to_csv.py

Validate, clean, and generate diagnostics for magnetometer CSV data
Validates CSV structure and required columns (x, y, B_total). Cleans missing/invalid data and detects outliers using robust z-score statistics (MAD-based). Detects spikes (sudden changes between consecutive measurements) and generates comprehensive diagnostic plots and reports.
data/processed/<stem>_clean.csv, <stem>_report.txt, and diagnostic plots
python3 scripts/validate_and_diagnosticsV1.py --in data/raw/mag_data.csv --drop-outliers --z-thresh 5.0
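The MAD-based robust z-score described above can be sketched as follows (a minimal illustration, not the script's exact implementation; the 1.4826 factor is the standard constant that makes MAD comparable to a standard deviation for normally distributed data):

```python
from statistics import median

def robust_z_scores(values, eps=1e-12):
    """Robust z-scores: (x - median) / (1.4826 * MAD)."""
    med = median(values)
    mad = median(abs(v - med) for v in values)
    scale = 1.4826 * mad
    if scale < eps:
        # All values (nearly) identical: nothing can be an outlier.
        return [0.0 for _ in values]
    return [(v - med) / scale for v in values]

def flag_outliers(values, z_thresh=5.0):
    """Mark values whose robust z-score exceeds the threshold."""
    return [abs(z) > z_thresh for z in robust_z_scores(values)]

readings = [50.1, 49.8, 50.3, 50.0, 120.0, 49.9]
print(flag_outliers(readings))  # [False, False, False, False, True, False]
```

Unlike a mean/standard-deviation z-score, the median and MAD are barely affected by the outlier itself, so a single 120.0 spike does not inflate the scale used to judge it.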



Detect local magnetic anomalies by comparing each point to its neighborhood rather than the global average
For each point, finds all neighbors within a specified radius and computes local mean B_total from neighbors. Calculates anomaly as: local_anomaly = B_total - local_mean. Optionally filters out flagged rows (outliers/spikes) and adds three anomaly columns.
data/processed/<input_stem>_anomaly.csv
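The neighborhood computation above can be sketched like this. Whether the script excludes the point itself from its neighborhood is an assumption here, as is the fallback when a point has no neighbors within the radius:

```python
import math

def local_anomalies(points, radius=0.30):
    """points: list of (x, y, b_total).
    anomaly = b_total - mean(b_total of neighbors within `radius`)."""
    out = []
    for i, (xi, yi, bi) in enumerate(points):
        neigh = [bj for j, (xj, yj, bj) in enumerate(points)
                 if j != i and math.hypot(xj - xi, yj - yi) <= radius]
        # If a point is isolated, its anomaly is 0 (assumed fallback).
        local_mean = sum(neigh) / len(neigh) if neigh else bi
        out.append(bi - local_mean)
    return out

pts = [(0.0, 0.0, 50.0), (0.1, 0.0, 50.0), (0.0, 0.1, 56.0)]
print(local_anomalies(pts))  # [-3.0, -3.0, 6.0]
```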
python3 scripts/compute_local_anomaly_v2.py --in data/processed/mag_data_clean.csv --radius 0.30 --plot

Interpolate scattered measurement points onto a regular grid and generate heatmap visualizations
Takes scattered (x, y, value) points from CSV and interpolates values onto a regular grid using IDW (Inverse Distance Weighting). Exports grid data as CSV and generates heatmap PNG visualization with configurable grid resolution and interpolation power.
data/exports/<stem>_grid.csv, <stem>_heatmap.png
python3 scripts/interpolate_to_heatmapV1.py --in data/processed/mag_data_anomaly.csv --value-col local_anomaly --grid-step 0.05
The heatmap visualizes local anomalies with color gradients: yellow/red for high positive anomalies, green for neutral, and blue/purple for low negative anomalies.
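The IDW estimate at a single grid point can be sketched as below; it is a minimal illustration of the technique, not the script's exact code, and the `power` default matches standard IDW practice rather than a documented flag:

```python
import math

def idw(samples, gx, gy, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate at grid point (gx, gy).
    samples: list of (x, y, value); each sample's weight is 1/d**power."""
    num = den = 0.0
    for x, y, v in samples:
        d = math.hypot(x - gx, y - gy)
        if d < eps:          # grid point coincides with a sample
            return v
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den

samples = [(0.0, 0.0, 1.0), (1.0, 0.0, 3.0)]
print(idw(samples, 0.5, 0.0))  # midpoint -> equal weights -> 2.0
```

Running this at every node of a regular grid (spaced by --grid-step) yields the exported grid CSV; the heatmap is then just a color-mapped rendering of that grid.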
The pipeline follows a clear data flow through organized directories
data/
├── raw/ # Original sensor data (from mag_to_csv.py)
├── processed/ # Cleaned and analyzed data (from validate + anomaly scripts)
└── exports/ # Final outputs (grids, heatmaps)
Flow: raw/ → processed/ → exports/

mag_to_csv.py uses an auto-grid system where you configure the grid layout (number of points and spacing).
The script automatically calculates each grid point and prompts you to move the sensor there.
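Grid-point generation might look like the sketch below; the function and parameter names are illustrative, not taken from mag_to_csv.py:

```python
def grid_points(nx, ny, step):
    """Enumerate grid coordinates row by row (hypothetical auto-grid sketch)."""
    return [(ix * step, iy * step) for iy in range(ny) for ix in range(nx)]

# A 3 x 2 grid at 0.05 m spacing, visited row by row:
print(grid_points(3, 2, 0.05))
```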
validate_and_diagnosticsV1.py adds flag columns that mark outliers (robust z-score) and spikes (sudden changes between consecutive measurements).
These flags can be used to filter data in subsequent steps.
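Downstream filtering on those flags might look like this; the column names `is_outlier` and `is_spike` are illustrative, not necessarily the ones the script emits:

```python
rows = [
    {"x": 0.0, "y": 0.0, "B_total": 50.1, "is_outlier": False, "is_spike": False},
    {"x": 0.1, "y": 0.0, "B_total": 120.0, "is_outlier": True, "is_spike": False},
    {"x": 0.2, "y": 0.0, "B_total": 49.9, "is_outlier": False, "is_spike": True},
]

# Keep only rows with no flags set:
clean = [r for r in rows if not (r["is_outlier"] or r["is_spike"])]
print(len(clean))  # 1
```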
Unlike global anomalies (comparing to the overall mean), local anomalies compare each point to its nearby neighbors. This helps detect small-scale variations that a broad regional trend or background field would otherwise mask.
Inverse Distance Weighting assigns each grid point a weighted average of nearby measurements, with weights proportional to 1/distance^p, so closer measurements influence the estimate more strongly.
Original version of the local anomaly computation (simpler, no CLI). Status: superseded by compute_local_anomaly_v2.py; use v2 instead.
Placeholder - functionality to be implemented
Placeholder - functionality to be implemented
Run any script with the --help flag for argument information. Output files are named after the input file's stem (e.g. <stem>_clean.csv, <stem>_anomaly.csv).