Job Description :
- Work within one or more scrum teams to deliver product stories according to priorities set by FCC and the Product Owners.
- Interact with stakeholders.
- Work with FCC's data pipeline to modernize legacy ETL jobs utilizing AWS technologies and DataVault 2.0.
- Relevant Experience: 5 years minimum, 7 years preferred.
Skills & Experience :
- Proficient with Spark.
- AWS experience (Redshift, Glue, Step Functions, QuickSight). SAS experience (SAS Enterprise Guide, SAS Data Integration, SAS MIP) is a plus.
- Ability to communicate moderately to highly complex technical concepts to technical and non-technical personnel.
- Ability to conceptualize and articulate ideas clearly and concisely.
- Ability to design and implement functional, easy-to-understand code.
- Innovative problem-solver and critical thinker with a customer focus.
- Advocate for smart, clean, and maintainable code.
- Passion for technology, software, and data development.
- Knowledge of relational database management systems (RDBMS).
- Experience designing and building data environments to support reporting and analytics, including data integrations and flow between disparate data systems.
- Experience with data modeling, data engineering, and/or data warehouse building.
Required Experience :
- Experience writing, troubleshooting, and optimizing AWS Glue jobs.
- Experience with SQL.
- Experience designing, implementing, and orchestrating data pipelines.
- Experience designing and implementing QuickSight reports/dashboards.
- Experience with Informatica and Teradata is preferred.
Nice to have :
- Experience with process orchestration tools like ActiveBatch.
- Data visualization experience.
Key Skills
Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala
Employment Type : Full-Time
Experience : 5+ years
Vacancy : 1