Data Engineer ID56374


AgileEngine is an Inc. 5000 company that creates award-winning software for Fortune 500 brands and trailblazing startups across 17+ industries. We rank among the leaders in areas like application development and AI/ML, and our people-first culture has earned us multiple Best Place to Work awards.<br><br><b>WHY JOIN US</b><br>If you're looking for a place to grow, make an impact, and work with people who care, we'd love to meet you!<br><br><b>WHAT YOU WILL DO</b><br>- Build and maintain scalable, distributed, fault-tolerant data pipelines using Microsoft Fabric;<br>- Develop and manage lakehouse layers and Delta Lake workflows for data processing;<br>- Collaborate with stakeholders across data engineering, compliance, and business teams;<br>- Design and implement pipelines to acquire, normalize, transform, and release large volumes of financial data;<br>- Design and implement bitemporal data models for regulatory-grade time-series datasets;<br>- Build and maintain testing frameworks for data pipelines and transformation logic;<br>- Own end-to-end solutions including ingestion pipelines, QA workflows, correction management, and audit trails;<br>- Contribute to shared platform services in a collaborative environment;<br>- Support implementation of AI solutions including data ingestion, anomaly detection, and semantic search.<br><br><b>MUST HAVES</b><br>- <b>6–8 years of experience in data engineering</b>;<br>- <b>Proficiency in Python</b> for data pipelines, transformation logic, and automation;<br>- <b>Proficiency in SQL</b> including window functions, partitioning, and time-series queries;<br>- <b>Hands-on experience with Microsoft Fabric</b> (OneLake, Data Factory, Lakehouse, Warehouse);<br>- <b>Working knowledge of Delta Lake</b> including incremental merges and Change Data Feed;<br>- <b>Experience with AI-assisted development tools</b> such as GitHub Copilot or similar;<br>- <b>Experience with Git</b> version control and 
collaboration workflows;<br>- Familiarity with REST APIs for integrations;<br>- Familiarity with Azure technologies (Azure Data Factory, Azure SQL, Azure Key Vault, RBAC);<br>- Understanding of financial data concepts related to equities and other asset classes;<br>- <b>Upper-intermediate English level.</b><br><br><b>NICE TO HAVES</b><br>- Knowledge of data libraries such as pandas or PySpark;<br>- Experience with columnar storage and time-series analytics tools such as ClickHouse;<br>- Familiarity with Microsoft Purview for data governance;<br>- Understanding of bitemporal data modeling concepts;<br>- Knowledge of financial reference data such as equities, fixed income, or corporate actions;<br>- Experience with CI/CD pipelines and automated deployments;<br>- Exposure to LLMs and Agentic AI for data-related use cases.<br><br><b>PERKS AND BENEFITS</b><br>- <b>Remote work & local connection: </b>Work where you feel most productive, and join periodic meet-ups to strengthen your network and connect with other top experts.<br>- <b>Legal presence in India: </b>We ensure full local compliance with a structured, secure work environment tailored to Indian regulations.<br>- <b>Competitive compensation in INR: </b>Fair compensation in INR with dedicated budgets for your personal growth, education, and wellness.<br>- <b>Innovative projects: </b>Leverage the latest tech and create cutting-edge solutions for world-recognized clients and the hottest startups.
