Sr. Software Data Engineer

4 years

15.0 - 23.0 Lacs P.A.

Pune

Posted: 2 weeks ago | Platform: Naukri


Skills Required

Kubernetes, public cloud, Python, software, data, PySpark, problem solving, data pipelines, relational databases, data engineering, SQL, Databricks, cloud, Java, data modeling, Terraform, investment management, software engineering, object-oriented programming, data visualization, AWS

Work Mode

Work from Office

Job Type

Full Time

Job Description

The Role

We are seeking an experienced Senior Software Data Engineer to join the Data Integrations Team, a critical component of the Addepar Platform team. The Addepar Platform is a comprehensive data fabric that provides a single source of truth for our product set, encompassing a centralized and self-describing repository, API-driven data services, an integration pipeline, analytics infrastructure, warehousing solutions, and operating tools. The Data Integrations team is responsible for the acquisition, conversion, cleansing, reconciliation, modeling, tooling, and infrastructure related to the integration of market and security master data from third-party data providers. This team plays a crucial role in our core business, enabling alignment across public and alternative investment data products and empowering clients to effectively manage their investment portfolios.

As a Senior Software Data Engineer, you will collaborate closely with product counterparts in an agile environment to drive business outcomes. Your responsibilities will include contributing to complex engineering projects using a modern and diverse technology stack, including PySpark, Python, AWS, Terraform, Java, Kubernetes, and more.

What You’ll Do

- Partner with multi-functional teams to design, develop, and deploy scalable data solutions that meet business requirements
- Build pipelines that support the ingestion, analysis, and enrichment of financial data by collaborating with business data analysts
- Advocate for standard methodologies and find opportunities for automation and optimization in code and processes to increase the throughput and accuracy of data
- Develop and maintain efficient process controls and accurate metrics that improve data quality and increase operational efficiency
- Work in a fast-paced, dynamic environment to deliver high-quality results and drive continuous improvement

Who You Are

- 5+ years of professional software data engineering experience
- A computer science degree or equivalent experience
- Proficiency with at least one object-oriented programming language (Python or Java)
- Proficiency with PySpark, relational databases, SQL, and data pipelines
- Rapid learner with strong problem-solving skills
- Knowledge of financial concepts (e.g., stocks, bonds) is helpful but not necessary
- Experience in data modeling and visualization is a plus
- Passion for the world of FinTech and for solving previously intractable problems at the heart of investment management is a plus
- Experience with any public cloud is highly desired (AWS preferred)
- Experience with data-lake or data platforms like Databricks is highly preferred

Important Note: This role requires working from our Pune office 3 days a week (hybrid work model).

