Overview
AWS Databricks - Senior - Job Description
Responsibilities
- Expertise in data warehousing and ETL design and implementation
- Hands-on experience with programming languages such as Python, PySpark, or Scala
- Good understanding of Spark architecture and its internals
- Expert in working with Databricks on AWS
- Hands-on experience using AWS services such as Glue (PySpark), Lambda, S3, Athena, RDS, IAM, and Lake Formation
- Hands-on experience implementing loading strategies such as SCD1 and SCD2, table/partition refresh, insert/update (upsert), and partition swap (see the sketch after this list)
- Experience consuming and writing data to and from flat files, RDBMS systems, MPP systems, JSON and XML, services, streams, queues, CDC feeds, etc.; awareness of scheduling and orchestration tools
- Awareness of OS, compute, and networking fundamentals, and of the internal workings and architecture of database and ETL servers and their impact on ETL
- Experience with RDBMS systems and concepts
- Expertise in writing complex SQL queries and developing database components, including views, stored procedures, triggers, etc.
- Ability to create test cases and perform unit testing of ETL jobs
- Analytical mindset with strong problem-solving and debugging skills
- Excellent communication skills
- Awareness of replication, synchronization, and disaster management techniques
- Hands-on experience in data quality management
- Awareness of data governance concepts and implementation
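For the loading strategies mentioned above, the following is a minimal, illustrative sketch of how an SCD2-style load might look on Databricks using Delta Lake and PySpark. The table names, column names (customer_id, attr_hash, is_current, start_date, end_date), and the two-step expire-then-append pattern are assumptions for illustration only, not a prescribed implementation.

```python
# Minimal SCD2-style load sketch (hypothetical tables and columns).
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Incoming batch of changed records (hypothetical staging table)
updates = spark.table("staging.customer_updates")

dim = DeltaTable.forName(spark, "warehouse.dim_customer")

# Step 1: expire current dimension rows whose attributes have changed.
(dim.alias("t")
    .merge(
        updates.alias("s"),
        "t.customer_id = s.customer_id AND t.is_current = true"
    )
    .whenMatchedUpdate(
        condition="t.attr_hash <> s.attr_hash",  # only rows that actually changed
        set={
            "is_current": F.lit(False),
            "end_date": F.current_date(),
        }
    )
    .execute())

# Step 2: append the new current versions of changed/new records.
new_rows = (updates
            .withColumn("is_current", F.lit(True))
            .withColumn("start_date", F.current_date())
            .withColumn("end_date", F.lit(None).cast("date")))
new_rows.write.format("delta").mode("append").saveAsTable("warehouse.dim_customer")
```

In practice, the change-detection condition and the handling of late-arriving or deleted records would depend on the source system and the CDC mechanism in use.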
Experience
3-7 years of relevant experience.