Senior Data Engineer - Snowflake
About Pearster
We are a USA-based global IT company offering Team Extension, Managed Services, and Performance Squads to clients in the USA, Canada, and Europe. With a global team, we connect talent with top opportunities while creating a personalized and enriching experience for our staff.
We are looking for a savvy Senior Data Engineer to join the Data Science & Management team in a leading Canadian company. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams.
The ideal candidate is an experienced data pipeline builder and data modeler who enjoys optimizing data systems and building them from the ground up. The Senior Data Engineer will support our product owners, product analysts, data analysts, and data scientists on data initiatives and will ensure that an optimal data delivery architecture remains consistent throughout ongoing work. They must be comfortable supporting the data needs of multiple teams, systems, and products, and working in an agile-minded team.
The right candidate will be excited by the prospect of designing, developing, and maintaining our company's data architecture to support our next generation of products and data initiatives.
The project consists of migrating approximately 1,000 pipelines to Snowflake. The ideal candidate should have experience with Active Batch, Talend, Snowflake, Docker, and GitHub.
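For illustration only, here is a minimal sketch of the kind of Snowflake load step a migrated pipeline might contain, using the snowflake-connector-python package; every credential, identifier, and file path below is a hypothetical placeholder, and the real pipelines will of course vary.

    # Minimal sketch: stage a local file and bulk-load it into a Snowflake table.
    # All credentials, object names, and paths are hypothetical placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        user="PIPELINE_USER",        # hypothetical service account
        password="***",
        account="myorg-myaccount",   # hypothetical account identifier
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # Upload the file to the table's internal stage, then COPY it in.
        cur.execute("PUT file:///data/orders.csv @%ORDERS OVERWRITE = TRUE")
        cur.execute(
            "COPY INTO ORDERS FROM @%ORDERS "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()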
Key Responsibilities:
- Requires the application of theoretical, domain- and technology-specific knowledge, typically gained through formal education or relevant expertise within their professional area(s) (e.g., engineering, software design, systems architecture), to achieve results.
- May be required to guide and influence others; however, the primary focus of the role is the application of technical/domain expertise.
- Has wide-ranging experience; uses professional concepts and company objectives to resolve complex problems.
- Engages with business stakeholders to establish clear needs and link them to solutions, including setting up prototypes and involving multiple parties in design sessions.
- Exercises judgment in selecting methods and evaluation criteria to obtain results.
- Creates data collection, extraction, and transformation frameworks for structured and unstructured data.
- Develops and maintains data infrastructure systems (e.g. data warehouses) including data access points.
- Prepares and manipulates data using a variety of data integration and data pipeline tools including, but not limited to, Talend, SQL, Snowflake, Kafka, and Azure Data Factory.
- Creates efficient load processes, including logging, exception handling, notifications to support, and overall operational visibility (see the sketch after this list).
- Organizes data into formats and structures that optimize reuse and efficient delivery to business and analytics teams and to system applications.
- Integrates data across the data lake, data warehouse, and system applications to ensure the consistent delivery of information across the enterprise. Accountable for efficient architecture and systems design.
- Builds and evolves the data service layer and engages the team to bring together components for a best-in-class customer offering. Highly skilled in assessing overall data architecture and integrations and making ongoing improvements to the solution offering.
- Leads the architecture, design, and implementation of complex data architecture and integration solutions, including best practices for the full development life cycle: coding standards, code reviews, source control management, build processes, testing, and operations.
- Collaborates with data governance and strategy to ensure data lineage is well understood and constructed in a way to highlight data reuse and simplicity.
- Actively assesses new opportunities to simplify data operations with new tools, technologies, file storage, management approaches, and processes. Uses team context and experience to evaluate these opportunities and brings them forward to team members for assessment and implementation.
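As a purely illustrative sketch of the load-process responsibility above (not part of the role description): extract, load_to_snowflake, and notify_support below are hypothetical stand-ins for whatever extraction, loading, and alerting components the team actually uses.

    # Sketch of a load step with logging, exception handling, and a support
    # notification hook. Every function below is a hypothetical placeholder.
    import logging

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("orders_pipeline")

    def extract() -> list[dict]:
        # Stand-in for the real source-extraction step.
        return [{"order_id": 1}, {"order_id": 2}]

    def load_to_snowflake(rows: list[dict]) -> None:
        # Stand-in for a COPY INTO / insert step against the warehouse.
        log.info("would load %d rows here", len(rows))

    def notify_support(message: str) -> None:
        # Stand-in for the team's alerting hook (email, pager, chat, etc.).
        log.error("SUPPORT NOTIFICATION: %s", message)

    def run_load() -> None:
        log.info("starting load")
        try:
            rows = extract()
            load_to_snowflake(rows)
            log.info("load finished: %d rows", len(rows))
        except Exception:
            log.exception("load failed")  # full traceback for operational visibility
            notify_support("orders_pipeline load failed")
            raise  # let the scheduler mark the run as failed

    if __name__ == "__main__":
        run_load()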
Requirements:
- Bachelor's degree in Software Engineering, Computer Science, or a related field required (Master's an asset), or equivalent work experience in a technology or business environment.
- Minimum of 7 years of experience developing and following structured work processes in data engineering.
- Minimum of 7 years of experience in integration solutions development with data integration tools (e.g., Talend, Azure Data Factory).
- Minimum of 3 years of experience with real-time data streaming tools such as Kafka or similar.
- Experience with cloud platforms including Azure, AWS, or GCP.
- Experience with data warehousing platforms such as Snowflake or Databricks.
- High proficiency in SQL, Python, and other programming languages.
- Highly proficient in data management, governance, data design and database architecture. Proven track record of manipulating, processing, and extracting value from large, disconnected datasets.
- Proficient in data modelling and data architecture; experience with WhereScape RED/3D an asset.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift.
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
- Highly proficient in data modelling, data integrations, data orchestration, and supporting methodologies.
- Highly proficient in leading large-scale projects or significant project steps and in communicating progress and approach to technical and non-technical peers, clients, and leaders.
- Highly proficient in multiple programming languages and coding. Excellent ability to design and engineer moderately complex enterprise solutions.
- Working knowledge of message queuing, stream processing, and highly scalable big-data stores.
- Strong project management and organizational skills.
Benefits:
- Fully remote work arrangement as a contractor
- Competitive salary in USD
- 10 days of paid time off (PTO) per year
- 100% company-covered international certifications
- Access to coworking spaces
- English classes
- Engaging team-building activities
- Personalized gifts
- Welcome kit
- Referral program