16 years Big Data Architect IT Consultant with Databricks

AHU Technologies Inc

Washington (District of Columbia)

On-site

USD 90,000 - 150,000

Full time

30+ days ago

Job summary

Join a dynamic team as a Big Data Architect IT Consultant, where you'll leverage your expertise in developing cutting-edge data systems for the District of Columbia. This role offers a unique opportunity to work on innovative IoT and Smart City projects, collaborating with a senior data team to implement robust data architectures using tools like Databricks and Microsoft Azure. Your contributions will directly impact the efficiency of enterprise data operations and enhance the data-driven decision-making processes within the government. If you're passionate about big data and want to make a significant difference in a collaborative environment, this position is perfect for you!

Qualifications

  • Bachelor's degree in IT or equivalent experience required.
  • 16+ years of experience in project management and compliance.

Responsibilities

  • Coordinate IT project management, engineering, and maintenance.
  • Develop and enforce standards for system design and documentation.
  • Provide training and perform application upgrades.

Skills

Big Data storage and analytics platforms
Data Architecture and Implementation best practices
Networking, security, and storage on cloud platforms
Deployment of data tools on cloud platforms
Data-centric systems for analysis and visualization
Querying structured and unstructured data sources
Data modeling and ingestion
Apache data products (Spark, Sedona, Airflow, etc.)
API / Web Services (REST/SOAP)
Complex event processing and real-time streaming data
Deployment and management of data science tools
ETL, data processing, analytics (Python, Java, R)
Cloudera Data Platform
Project management and coordination

Education

Bachelor’s degree in Information Technology (or equivalent experience)

Tools

Databricks
Microsoft Azure
Apache Hadoop
Tableau
MicroStrategy
Esri
Oracle databases
Streamsets
Apache NiFi
Azure Data Factory

Job description

Role: Big Data Architect IT Consultant Master

Client: District of Columbia

Location: Washington, D.C.


This role will provide expertise to support the development of a Big Data / Data Lake system architecture that supports enterprise data operations for the District of Columbia government, including the Internet of Things (IoT) / Smart City projects, enterprise data warehouse, the open data portal, and data science applications. This is an exciting opportunity to work as a part of a collaborative senior data team supporting DC's Chief Data Officer. This architecture includes Databricks, Microsoft Azure platform tools (including Data Lake, Synapse), Apache platform tools (including Hadoop, Hive, Impala, Spark, Sedona, Airflow) and data pipeline/ETL development tools (including Streamsets, Apache NiFi, Azure Data Factory). The platform will be designed for District-wide use and integration with other OCTO Enterprise Data tools such as Esri, Tableau, MicroStrategy, API Gateways, and Oracle databases and integration tools.


Responsibilities:
  1. Coordinates IT project management, engineering, maintenance, QA, and risk management.
  2. Plans, coordinates, and monitors project activities.
  3. Develops technical applications to support users.
  4. Develops, implements, maintains and enforces documented standards and procedures for the design, development, installation, modification, and documentation of assigned systems.
  5. Provides training for system products and procedures.
  6. Performs application upgrades.
  7. Performs monitoring, maintenance, or reporting on real-time databases, real-time network and serial data communications, and real-time graphics and logic applications.
  8. Troubleshoots problems.
  9. Ensures the project life cycle complies with District standards and procedures.

Minimum Education/Certification Requirements:

Bachelor’s degree in Information Technology or related field or equivalent experience


Skills
  1. Experience implementing Big Data storage and analytics platforms such as Databricks and Data Lakes.
  2. Knowledge of Big Data and Data Architecture and Implementation best practices - 5 Years.
  3. Knowledge of architecture and implementation of networking, security and storage on cloud platforms such as Microsoft Azure - 5 Years.
  4. Experience with deployment of data tools and storage on cloud platforms such as Microsoft Azure - 5 Years.
  5. Knowledge of Data-centric systems for the analysis and visualization of data, such as Tableau, MicroStrategy, ArcGIS, Kibana, Oracle - 10 Years.
  6. Experience querying structured and unstructured data sources including SQL and NoSQL databases - 5 Years.
  7. Experience modeling and ingesting data into and between various data systems through the use of Data Pipelines - 5 Years.
  8. Experience with implementing Apache data products such as Spark, Sedona, Airflow, Atlas, NiFi, Hive, Impala - 5 Years.
  9. Experience with API / Web Services (REST/SOAP) - 3 Years.
  10. Experience with complex event processing and real-time streaming data - 3 Years.
  11. Experience with deployment and management of data science tools and modules such as JupyterHub - 3 Years.
  12. Experience with ETL, data processing, analytics using languages such as Python, Java or R - 3 Years.
  13. Experience with Cloudera Data Platform - 3 Years.
  14. Planning, coordinating, and monitoring project activities - 16 Years.
  15. Leading projects and ensuring compliance with established standards/procedures - 16 Years.
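Several of the skills above (querying structured and unstructured sources, data ingestion, and ETL with Python) describe a standard extract-transform-load pattern. As a minimal, hypothetical sketch of that pattern, the snippet below uses only Python's standard library (`json` and `sqlite3`) in place of the platform tools named in this posting (Databricks, Streamsets, Azure Data Factory); the sensor names and readings are illustrative sample data, not part of the role.

```python
import json
import sqlite3

# Extract: a small unstructured (JSON) source -- illustrative sample data only.
raw = json.loads("""
[
  {"sensor": "air-01", "reading": 41.5, "ward": 1},
  {"sensor": "air-02", "reading": 38.2, "ward": 1},
  {"sensor": "air-03", "reading": 55.0, "ward": 2}
]
""")

# Transform: keep readings under a threshold and flatten each record to a row.
rows = [(r["sensor"], r["ward"], r["reading"]) for r in raw if r["reading"] < 50]

# Load: ingest the rows into a structured (SQL) store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor TEXT, ward INTEGER, value REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)

# Query the structured store: average reading per ward.
avg_by_ward = dict(
    conn.execute("SELECT ward, AVG(value) FROM readings GROUP BY ward")
)
print(avg_by_ward)
```

In a production data-lake pipeline the same three stages would be distributed (e.g. Spark jobs reading from cloud storage and writing to a warehouse), but the extract/transform/load separation is identical.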