Delta Hedging & Volatility Trading Simulator
Overview
A dynamic delta hedging system for a short NVDA European call option, replicating option exposure using the underlying asset and cash.
The system recalibrates delta at each rebalancing step using Black-Scholes-Merton Greeks and evaluates hedging performance under discrete-time execution and market noise.
Trading Problem
Short volatility strategies depend on managing gamma risk under imperfect replication and discrete execution.
Core Idea
A short call option position is dynamically hedged using delta replication to study:
- Hedging error
- Gamma exposure
- Volatility sensitivity
- PnL path dependence
Strategy (Short Volatility Exposure)
The portfolio is structured as a short call position:
- Short gamma → exposure to convex losses during large moves
- Short vega → benefits from volatility compression
- Long theta → captures time decay
This reflects a volatility risk premium strategy, where returns come from option selling while delta risk is dynamically hedged.
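The sign structure above can be checked with a minimal BSM Greeks sketch (stdlib only; the parameters are illustrative, not market data):

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function (avoids a scipy dependency)
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def bsm_call_greeks(S, K, T, r, sigma):
    """Black-Scholes-Merton Greeks for a LONG European call, no dividends."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    delta = norm_cdf(d1)
    gamma = norm_pdf(d1) / (S * sigma * math.sqrt(T))
    vega = S * norm_pdf(d1) * math.sqrt(T)
    theta = (-S * norm_pdf(d1) * sigma / (2.0 * math.sqrt(T))
             - r * K * math.exp(-r * T) * norm_cdf(d2))
    return delta, gamma, vega, theta

# Hypothetical parameters for illustration
delta, gamma, vega, theta = bsm_call_greeks(S=900, K=900, T=0.25, r=0.04, sigma=0.45)
# The SHORT call position negates each Greek: short gamma (-gamma < 0),
# short vega (-vega < 0), and positive carry since the long theta is negative.
```

For the long call, gamma and vega come out positive and theta negative; the short position flips every sign, which is exactly the short-gamma / short-vega / long-theta profile listed above.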
Hedging Framework
At each time step:
- Compute option delta using Black-Scholes-Merton
- Rebalance underlying position to maintain delta neutrality
- Adjust cash position under a self-financing constraint
- Track portfolio value and PnL evolution
This approximates a continuous-time hedging strategy using discrete execution intervals, introducing realistic replication error.
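The per-step loop can be sketched as follows (stdlib only; GBM dynamics and parameters are assumptions for illustration, not the project's actual simulator):

```python
import math
import random

def _N(x):
    # Standard normal CDF via erf (stdlib only)
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bsm_call_delta(S, K, T, r, sigma):
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    return _N(d1)

def bsm_call_price(S, K, T, r, sigma):
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * _N(d1) - K * math.exp(-r * T) * _N(d2)

random.seed(7)
S, K, T, r, sigma = 900.0, 900.0, 0.25, 0.04, 0.45  # illustrative parameters
steps = 60
dt = T / steps

price0 = bsm_call_price(S, K, T, r, sigma)
shares = bsm_call_delta(S, K, T, r, sigma)   # initial hedge ratio
cash = price0 - shares * S                   # option premium funds the hedge

for i in range(1, steps + 1):
    z = random.gauss(0.0, 1.0)
    S *= math.exp((r - 0.5 * sigma**2) * dt + sigma * math.sqrt(dt) * z)  # GBM step
    cash *= math.exp(r * dt)                 # interest accrues on the cash account
    tau = T - i * dt
    # At expiry the hedge collapses to the payoff indicator
    new_shares = bsm_call_delta(S, K, tau, r, sigma) if tau > 1e-12 else float(S > K)
    cash -= (new_shares - shares) * S        # self-financing rebalance
    shares = new_shares

# Discrete rebalancing leaves a nonzero replication (hedging) error
hedge_error = shares * S + cash - max(S - K, 0.0)
print(f"premium={price0:.2f}  hedging error={hedge_error:.2f}")
```

With only 60 rebalances the terminal portfolio misses the payoff by a small but nonzero amount, which is exactly the discrete-execution replication error the simulator studies.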
Trading Mapping
This replicates:
- Market making with dynamic hedging
- Short volatility exposure
- Gamma risk management
- Execution-constrained replication
Key Insights
- Discrete hedging introduces structural replication error
- Gamma risk dominates near expiry
- Volatility mis-specification is a primary PnL driver
- PnL is path-dependent, not state-dependent
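A standard result in volatility trading ties these insights together: under BSM assumptions, the instantaneous PnL of the delta-hedged short call is approximately

```latex
\mathrm{d}PnL \;\approx\; -\tfrac{1}{2}\,\Gamma S^{2}\left(\sigma_{\mathrm{realized}}^{2}-\sigma_{\mathrm{implied}}^{2}\right)\mathrm{d}t
```

where Γ is the (positive) call gamma and the minus sign reflects the short position. Volatility mis-specification enters directly through the σ² difference, and because Γ S² varies along the path, the same terminal state can produce very different cumulative PnL.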
Core Takeaway
Delta hedging removes directional exposure but does not eliminate risk.
Real-Time Delta Hedging Simulation
Real-time simulation of portfolio rebalancing, gamma exposure, and hedging error under non-continuous execution constraints.
Live Portfolio Evolution under Dynamic Delta Rebalancing
Automated hedging system updating portfolio state in real time, including delta recalculation, underlying adjustments, and PnL evolution under discrete execution intervals.
Market Data Pipeline - Exchange Feed Ingestion System
Overview
A market data ingestion system that transforms raw XML exchange feeds into structured inputs for quantitative trading models.
The system ensures that trading and pricing models operate on consistent, reliable market data, where input quality directly impacts signal stability, pricing accuracy, and execution decisions.
Trading Problem
Market data quality is a first-order driver of trading performance. Small inconsistencies propagate into pricing, risk, and execution errors.
Core Idea
This system transforms raw exchange data into consistent, model-ready inputs for trading systems.
Trading Role in System
Market Data Feed → Structured Inputs → Trading Decisions
Key Components
- Extraction of raw XML exchange feeds
- Structured transformation (ETL pipeline)
- Normalization of market variables
- Persistent storage for reproducibility
- Scheduled updates for time-series consistency
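The extract → normalize → persist stages above can be sketched in a few lines (the feed snippet, field names, and SQLite target are hypothetical stand-ins; the production pipeline targets Azure SQL):

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical raw feed fragment; real exchange schemas differ
RAW = """<quotes ts="2024-01-02T09:30:00Z">
  <q sym="NVDA" px="495.20" sz="100"/>
  <q sym="nvda " px="495.25" sz="200"/>
</quotes>"""

def extract(xml_text):
    # Extraction: pull raw records out of the XML feed
    root = ET.fromstring(xml_text)
    ts = root.get("ts")
    for q in root.iter("q"):
        yield {"ts": ts, "sym": q.get("sym"), "px": q.get("px"), "sz": q.get("sz")}

def normalize(rec):
    # Normalization: canonical symbol casing, explicit numeric types
    return (rec["ts"], rec["sym"].strip().upper(), float(rec["px"]), int(rec["sz"]))

# Persistence: in-memory here; a file-backed database in production
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE quotes (ts TEXT, sym TEXT, px REAL, sz INTEGER)")
conn.executemany("INSERT INTO quotes VALUES (?, ?, ?, ?)",
                 (normalize(r) for r in extract(RAW)))
rows = conn.execute("SELECT sym, px FROM quotes ORDER BY px").fetchall()
print(rows)
```

Note how normalization canonicalizes the inconsistently cased, whitespace-padded symbol before storage, so downstream models see one consistent ticker.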
Trading Mapping
This layer supports:
- Volatility modeling
- Risk calculations
- Execution strategies
- Microstructure models
Key Insights
- Data consistency is critical for trading correctness
- Small schema inconsistencies amplify into PnL distortion
- Most model failures originate from data, not modeling
- Market data is a first-order driver of strategy quality
Core Takeaway
Trading systems are only as reliable as their data infrastructure.
Market Data Ingestion Pipeline Outputs
Automated Market Data Ingestion
End-to-end pipeline execution demonstrating structured extraction, transformation, and processing of market data for downstream trading system consumption.
Scheduled Market Data Refresh System for Streaming Market Simulation
Cron-based execution framework enabling periodic ingestion cycles, simulating real-time market data updates for downstream model responsiveness testing.
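A crontab entry along these lines (interval, interpreter path, and script path are all hypothetical) could drive the periodic ingestion cycles:

```
# m    h  dom mon dow  command
*/15 * * * * /usr/bin/python3 /opt/pipeline/ingest.py >> /var/log/ingest.log 2>&1
```

Each run appends to a log, giving a timestamped audit trail of ingestion cycles for time-series consistency checks.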
Bulk Market Data Ingestion into Persistent Trading Data Infrastructure
High-volume ingestion pipeline using BCP into Azure SQL for historical storage, risk analysis, and reproducible backtesting environments.
XML Market Data Parser & Transformation Pipeline
Overview
This project implements a market data transformation pipeline that converts raw XML-based financial feeds into structured, model-ready datasets used across pricing, risk, and execution systems.
Trading Problem
Raw market data is not usable for trading. It must be structurally consistent before it can feed pricing, risk, or execution models.
Key Function
- Converts hierarchical exchange data into tabular form
- Preserves schema relationships and traceability
- Normalizes inconsistent financial data structures
- Enables downstream model reliability
Trading System Mapping
- XML feed → raw exchange data
- Parser → market data ingestion layer
- Structured output → model-ready inputs
- Transformation → pre-pricing / pre-risk conditioning
Key Insights
- Market data is a structural problem, not a data problem
- Schema consistency determines downstream model reliability
- Parsing design affects system correctness
- Data preprocessing is a prerequisite for any trading strategy
Dual Parsing Logic (Robustness vs Control)
- High-level parsing for flexibility (xmltodict)
- Low-level streaming parsing for deterministic control (ElementTree)
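The low-level path can be sketched with `ElementTree.iterparse`, which processes elements as they complete and so handles large feeds in near-constant memory; the feed fragment and field names are hypothetical (the high-level path would instead call `xmltodict.parse`, which returns the whole document as nested dicts):

```python
import xml.etree.ElementTree as ET
from io import StringIO

# Hypothetical nested exchange feed; real schemas are deeper
FEED = """<feed>
  <instrument id="NVDA"><price>900.5</price></instrument>
  <instrument id="AAPL"><price>190.1</price></instrument>
</feed>"""

# Streaming parse: handle each <instrument> as soon as its end tag arrives,
# then release it, rather than loading the whole tree (or dict) into memory.
rows = []
for event, elem in ET.iterparse(StringIO(FEED), events=("end",)):
    if elem.tag == "instrument":
        rows.append({"id": elem.get("id"),
                     "price": float(elem.findtext("price"))})
        elem.clear()  # free the already-processed subtree
print(rows)
```

The flexibility/control trade-off is visible here: `xmltodict` gives schema-agnostic access with no per-element code, while the streaming loop gives deterministic, element-by-element control over memory and error handling.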
Core Takeaway
Robust trading systems begin with correct data structure, not modeling complexity.
Hierarchical Market Data Structure (XML Feed)
XML-based market data transformation system converting nested exchange feeds into structured datasets for pricing, risk, and execution models.
Hierarchical financial data encoding demonstrating how raw exchange feeds represent instrument-level market structure used in downstream trading systems.