Job Description

Job Title: Data Engineer/Modeler
Job Number: 22164
Type of Employment: Permanent Role

Overture Partners' client has an exciting opportunity for a Data Engineer/Modeler to work with an industry leader in a fully remote role. The ideal candidate must have excellent communication and collaboration skills.

Opportunity Overview:
  • Advocate for well-founded data management practices to foster a better understanding of data and analytics.
  • Define, develop, and execute strategies to implement and enhance the data ecosystem.
  • Interact with stakeholders and subject matter experts to determine project requirements that align with business goals.
  • Partner with colleagues and relevant subject matter experts on analytical solutions and models to optimize data quality and security before deploying them to production.
  • Prioritize, scope and manage data engineering projects.
  • Discover, understand, and integrate new internal and external data sources, structures, and process pipelines.
  • Drive and participate in all tasks associated with the design, development, implementation, and support of ETL/ELT processes.
  • Implement data structures across a variety of databases and platforms, including data lakes, data warehouses, online caches, and near-time systems, on technologies such as SQL Server, Oracle, Teradata, Redshift, MongoDB, DynamoDB, and Snowflake.
  • Apply data modeling methodologies such as relational third normal form, star schema, snowflake schema, schema-on-read, and data vault to design and implement data models.
  • Architect, implement, optimize and maintain reliable data pipelines from source acquisition to integration and consumption while guaranteeing compliance with data governance and data security requirements.
  • Use cutting-edge tools, methodologies, and architectures to automate repeatable tasks, improving productivity and limiting errors.

Qualifications & Required Skills:
  • Proven experience with data management disciplines.
  • Experience with data integration, data modeling, optimization, and data quality commensurate with the job level.
  • Working knowledge of data discovery, analytics, visualization and business intelligence tools.
  • Strong experience with scripting languages and environments such as Python, PowerShell, and Linux shell scripting.
  • Understanding of at least one formal programming language, such as Java or C++.
  • Basic understanding of ETL/ELT platforms such as SSIS, DataStage, or Informatica.
  • Thorough knowledge of relational and non-relational database theory and structured query language (SQL).
  • Ability to design, build and optimize data pipelines, ETL processes, and data structures.
  • Demonstrated success integrating large, heterogeneous datasets and automating repetitive data preparation tasks.
  • Skilled at working across multiple deployment environments, including cloud, on-premises, and hybrid.
  • Adept at applying software engineering methodologies and DevOps principles to data pipelines to improve the communication, integration, reuse, and automation of data flows across the organization.

Educational Requirements:

A bachelor's degree in data management, computer science, information systems, mathematics, data/analytics, or a related field, or equivalent work experience, is required.

Application Instructions

Please click on the link below to apply for this position. A new window will open and direct you to apply at our corporate careers page. We look forward to hearing from you!

Apply Online