A financial firm is looking for a Data Engineer - Databricks to join their team. This role is remote.
Pay: $100-130/hr
US Citizens or GC Holders Only
No C2C - W2 only
Qualifications:
- A Bachelor's Degree in Computer Science, Management Information Systems, Computer Engineering, or related field.
- 7+ years of experience in designing and building large-scale solutions in an enterprise setting.
- 3+ years designing and building solutions in the cloud.
- Expertise in building and managing Cloud databases such as AWS RDS, DynamoDB, DocumentDB or analogous architectures.
- Expertise in building Cloud Database Management Systems in Databricks Lakehouse or analogous architectures.
- Deep SQL expertise, data modeling, and experience with data governance in relational databases.
- Experience with the practical application of data warehousing concepts, methodologies, and frameworks using traditional (Vertica, Teradata, etc.) and current (SparkSQL, Hadoop, Kafka) distributed technologies.
- Refined skills in one or more scripting languages (e.g., Python, bash).
- Ability to embrace data platform thinking, designing and developing data pipelines with security, scale, uptime, and reliability in mind.
- Expertise in relational and dimensional data modeling.
- UNIX and general server administration experience.
- Experience leveraging CI/CD pipelines.
- Experience with Presto, Hive, SparkSQL, Cassandra, Solr, or other Big Data query and transformation technologies is a plus.
- Experience using Spark, Kafka, Hadoop, or similar distributed data technologies is a plus.
- Experience with Agile methodologies and the ability to work in an Agile manner is preferred.
- Experience using ETL/ELT tools and technologies such as Talend or Informatica is a plus.
- Expertise in cloud data warehouses such as Redshift, BigQuery, or analogous architectures is a plus.