A Risk Valuation and Feed Generation System
Challenge:
The Market Data, Trade Data and Reference Data used to build the risk feeds and valuations are collected from various golden sources. The market data keeps changing during the course of the day, which makes capturing a consistent snapshot in the current system difficult and, in turn, delays the generation of the risk feeds and valuations needed for timely decision making. Assets may have complex relationships and nested structures, necessitating graph-based modeling.
Solution:
- Use Databricks / Snowflake to ingest, transform and model data
- Since upstream systems cannot preload market data based on the previous day's feeds, intraday snapshots of the market data are used for the Risk Calculation to create an accurate Risk Valuation and Risk feed (a snapshot-driven calculation is sketched after this list).
- This is delivered as a solution leveraging Microservices.
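A minimal sketch of the snapshot-driven calculation, assuming a PySpark environment; the table names, column names and the mark-to-market formula are hypothetical placeholders:

```python
# Sketch only: hypothetical tables/columns and a simplified valuation.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("risk-feed-snapshot").getOrCreate()

# Freeze an intraday snapshot of the fast-moving market data: latest quote
# per instrument as of the snapshot timestamp.
snapshot_ts = "2024-01-15T14:30:00"
latest = Window.partitionBy("instrument_id").orderBy(F.col("as_of_ts").desc())
market_snapshot = (
    spark.table("market_data")                      # golden-source market data
    .where(F.col("as_of_ts") <= snapshot_ts)
    .withColumn("rn", F.row_number().over(latest))
    .where("rn = 1")
    .drop("rn")
)

trades = spark.table("trade_data")                  # golden-source trade data

# Join trades to the frozen snapshot and compute a simple valuation per trade.
risk_feed = (
    trades.join(market_snapshot, "instrument_id")
    .withColumn("valuation", F.col("quantity") * F.col("price"))
    .select("trade_id", "instrument_id", "valuation",
            F.lit(snapshot_ts).alias("snapshot_ts"))
)

risk_feed.write.mode("overwrite").saveAsTable("risk_feed_snapshot")
```

Because the snapshot is immutable, the same risk feed can be regenerated and audited later even though the underlying market data has moved on.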
Technical Solution
- During peak times, the Spark-based data framework can scale out to ensure efficient processing of data.
- The system is Cloud ready and is deployed on a public cloud.
- Built an Analytic Engine that breaks a large book of trades into separate tasks, processes those tasks by gathering information from various sources, and provides a timely Risk Calculation (a partitioning sketch follows this list).
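A minimal sketch of how the Analytic Engine might split a large trade book into independent tasks on Spark; the schema, the (desk, currency) task key and the placeholder risk number are assumptions for illustration:

```python
# Sketch only: hypothetical schema; the real engine also gathers market and
# reference data from multiple sources before valuing each task.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("analytic-engine").getOrCreate()
trade_book = spark.table("trade_data")

# Break the book into separate tasks, e.g. one task per (desk, currency) slice,
# and give each task its own Spark partition so tasks run in parallel.
tasks = (
    trade_book
    .withColumn("task_id", F.concat_ws("-", "desk", "currency"))
    .repartition("task_id")
)

def calculate_risk(rows):
    """Process one task: enrich its trades and emit risk numbers."""
    for row in rows:
        # Placeholder valuation; the real calculation is far richer.
        yield (row["trade_id"], row["task_id"], row["notional"] * 0.01)

risk_df = tasks.rdd.mapPartitions(calculate_risk).toDF(
    ["trade_id", "task_id", "risk_value"])
```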
Technical Implementation
- DevOps and CI/CD principles ensure zero-downtime deployments.
- The Atyeti team designed a system that is configurable across batch, mini-batch and stream processing architectures for the subscription and registration of data (a streaming configuration is sketched after this list).
- The Risk calculation process was refactored out of the application and built as an independently deployable and scalable unit
- Built the application to be Cloud ready and Cloud Agnostic
- Uniqueness: De-duplication and redundancy elimination
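A minimal sketch of the configurable subscription pipeline using Spark Structured Streaming; the Kafka topic, schema, paths and the Delta sink are hypothetical, and de-duplication is shown with a watermarked dropDuplicates:

```python
# Sketch only: hypothetical topic, schema and checkpoint/output paths.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("market-data-subscription").getOrCreate()

# Subscribe to the incoming market-data stream.
subscription = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "market-data")
    .load()
)

# Uniqueness: drop duplicate records within the watermark window.
deduped = (
    subscription
    .selectExpr("CAST(key AS STRING) AS instrument_id",
                "CAST(value AS STRING) AS payload",
                "timestamp")
    .withWatermark("timestamp", "10 minutes")
    .dropDuplicates(["instrument_id", "timestamp"])
)

# One trigger setting moves the same pipeline between processing modes:
#   availableNow=True          -> batch (drain available data, then stop)
#   processingTime="1 minute"  -> mini-batch
#   default trigger            -> continuous micro-batch streaming
query = (
    deduped.writeStream
    .format("delta")
    .option("checkpointLocation", "/chk/market-data")
    .trigger(processingTime="1 minute")
    .start("/tables/market_data")
)
```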
Data Services
- Built the application using Spark for Scalability, Resilience and Fault tolerance
- Task parallelization on Spark using Ray.io (a Ray sketch follows this list)
- Deployed CI/CD using Jenkins
- Applied industry standards and best practices in developing a Cloud ready solution
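A minimal sketch of task parallelization with Ray.io; the in-memory trade batches, batch size and risk function are placeholders, not the system's actual implementation:

```python
# Sketch only: placeholder risk function and in-memory trade batches.
import ray

ray.init(ignore_reinit_error=True)  # on a cluster, connect to the existing Ray head instead

@ray.remote
def calculate_risk(trade_batch):
    """Value one batch of trades; the real task also pulls market and reference data."""
    return [{"trade_id": t["trade_id"], "risk": t["notional"] * 0.01}
            for t in trade_batch]

# Break the book into batches and fan the work out across the Ray cluster.
trade_book = [{"trade_id": i, "notional": 1_000_000} for i in range(10_000)]
batches = [trade_book[i:i + 1000] for i in range(0, len(trade_book), 1000)]

futures = [calculate_risk.remote(batch) for batch in batches]
results = [row for batch_result in ray.get(futures) for row in batch_result]
```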
Result
- The Application can now generate Risk feeds in near Real-Time