Senior Data Engineer

India, Remote | Additional Locations: India, Hyderabad | Job ID: R0000036273 | Category: IT

About this Role

Key Responsibilities

  • Architect and implement end-to-end data pipelines using Azure Data Factory, Databricks, and Snowflake for large-scale data ingestion, transformation, and storage.

  • Design, build, modify, and support data pipelines using Microsoft Azure data PaaS services, Databricks, and Power BI in a medallion architecture setting (see the illustrative sketch after this list).

  • If necessary, create prototypes to validate proposed ideas and solicit input from stakeholders.

  • Demonstrate a strong grasp of, and expertise with, test-driven development and continuous integration processes.

  • Analysis and Design – Convert high-level designs into low-level designs and implement them.

  • Collaborate with Team Leads to define/clarify business requirements, estimate development costs, and finalize work plans.

  • Testing – Create and run unit and integration tests on all developed code throughout the development lifecycle.

  • Proactively benchmark application code to prevent performance and scalability issues.

  • Collaborate with the Quality Assurance Team on issue reporting, resolution, and change management.

  • Support and Troubleshooting – Assist the Operations Team with any environmental issues that arise during application deployment in the Development, QA, Staging, and Production environments.

  • Familiarity with Power BI and Reltio is advantageous but not required.

  • Collaborate with BI teams to ensure data models are optimized for reporting in Power BI, with a focus on performance and usability.

  • Establish data governance, quality, and security controls, ensuring compliance with GDPR, HIPAA, and global clinical data regulations.

  • Mentor and guide junior engineers, fostering technical excellence and knowledge sharing.

  • Drive automation and CI/CD practices within data engineering pipelines, integrating with version control and deployment workflows.

  • Work closely with Data Architects, Business Analysts, and Product Owners to translate business needs into technical solutions.
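
For context, the medallion architecture referenced above is typically implemented on Databricks with PySpark and Delta Lake; the sketch below shows one minimal bronze-to-silver step under that assumption. The schema, table names, landing path, and columns (clinical_events, event_id, event_date) are hypothetical placeholders, not details taken from this posting.

    # Minimal bronze -> silver sketch on Databricks (PySpark + Delta Lake).
    # All names, paths, and columns below are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

    # Bronze: land raw files as-is and record ingestion metadata.
    raw = (spark.read
           .option("header", "true")
           .csv("/mnt/landing/clinical_events/"))  # hypothetical landing path
    bronze = raw.withColumn("_ingested_at", F.current_timestamp())
    bronze.write.format("delta").mode("append").saveAsTable("bronze.clinical_events")

    # Silver: deduplicate and standardize types for downstream Snowflake / Power BI consumption.
    silver = (spark.table("bronze.clinical_events")
              .dropDuplicates(["event_id"])
              .withColumn("event_date", F.to_date("event_date")))
    silver.write.format("delta").mode("overwrite").saveAsTable("silver.clinical_events")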

Skills:

  • Expert-level knowledge of Azure Data Factory, Databricks, and Snowflake.

  • Understanding of quality processes and estimation techniques.

  • Understanding of design concepts and architectural basics.

  • Fundamental grasp of the project domain.

  • Ability to translate functional and non-functional needs into system requirements.

  • Ability to develop and code complex applications.

  • Ability to create test cases and scenarios based on specifications.

  • Solid knowledge of SDLC and agile techniques.

  • Knowledge of current technology and trends.

  • Logical thinking and problem-solving skills, as well as the ability to collaborate effectively.

  • Primary skills: Cloud Platform, Azure, Databricks, ADF, ADO.

  • Desired skills: SQL, Python, Power BI.

  • General Knowledge: PowerApps, Java/Spark, Reltio.

  • 5-7 years of experience in software development, including a minimum of 3 years in cloud computing.

  • Proficient in SQL, Python, and cloud-native architecture.

  • Strong grasp of data security, privacy compliance, and best practices in a regulated environment.

Education:

  • Bachelor's degree in a technical discipline (Mathematics, Science, Engineering, Computing, etc.)
