Data Architect
Job Summary:
We are looking for a seasoned Data Architect to lead the design, development, and implementation of enterprise data architecture solutions. In this role, you will work closely with business, technology, and data teams to define data strategy, modernize data platforms, and deliver scalable, secure, and high-performing data ecosystems.
The ideal candidate will have deep experience in data modeling, cloud-based data warehousing, and data governance, and will play a key role in enabling data-driven decision-making across the organization.
Key Responsibilities:
- Design and implement end-to-end data architecture, including data models, integration frameworks, metadata structures, and data pipelines.
- Define and enforce standards for data modeling, quality, security, and lifecycle management.
- Partner with stakeholders to align data architecture with business needs, reporting requirements, and digital transformation goals.
- Evaluate and select data technologies and platforms to support cloud, hybrid, and on-premises data solutions.
- Lead the development of data lake/data warehouse architectures and support advanced analytics, BI, and machine learning initiatives.
- Collaborate with engineering teams to design data APIs, pipelines, and ETL/ELT workflows.
- Ensure data architecture supports scalability, data privacy, regulatory compliance, and operational performance.
- Contribute to the development of enterprise-wide data governance, master data management (MDM), and data cataloging efforts.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 8+ years of experience in data architecture, data engineering, or related roles, with increasing levels of responsibility.
- Strong knowledge of data modeling (conceptual, logical, and physical) and of relational and non-relational databases.
- Hands-on experience with cloud-based data platforms (e.g., Azure Data Services, Amazon Redshift, Google BigQuery, Snowflake).
- Familiarity with modern data stack tools such as dbt, Apache Spark, Kafka, Airflow, or equivalent.
- Deep understanding of data governance, security, and regulatory considerations (e.g., GDPR, HIPAA, SOC 2).
- Experience in designing scalable and secure data architectures that support real-time and batch processing.
- Excellent problem-solving and communication skills; ability to bridge technical and business perspectives.