Posted 1 year ago

Job roles and responsibilities include:

  • Leading and managing a team of data engineers, providing guidance, mentoring, and ensuring that the team meets project goals and deadlines.
  • Designing and maintaining the data architecture for the organization, including data warehouses, data lakes, and other data storage solutions.
  • Building and maintaining data pipelines for extracting, transforming, and loading (ETL) data from various sources into the data storage solutions.
  • Integrating data from different sources, both internal and external, to create a unified and accessible data ecosystem.
  • Developing and managing data models, including data schemas, to support analytics, reporting, and data-driven decision-making.
  • Ensuring data quality and implementing data governance policies and practices to maintain data accuracy, consistency, and compliance with regulations.
  • Monitoring and optimizing data pipelines and systems for performance, scalability, and efficiency.
  • Implementing data security measures to protect sensitive data, including access controls, encryption, and compliance with data protection regulations.
  • Selecting, implementing, and maintaining data engineering tools and technologies that best fit the organization’s needs, such as ETL tools, data orchestration frameworks, and database systems.
  • Developing data transformation processes to prepare and clean data for analysis, ensuring it is suitable for reporting and other analytical purposes.
  • Creating and maintaining documentation for data pipelines, schemas, and processes to facilitate collaboration and knowledge sharing within the team and across the organization.
  • Collaborating with data scientists, analysts, and other stakeholders to understand their data requirements and ensure the data infrastructure meets those needs.
  • Investigating and resolving data-related issues and providing support to other teams in the organization when they encounter data problems.
  • Implementing monitoring and alerting systems to proactively identify and address data pipeline and infrastructure issues.
  • Estimating future data storage and processing requirements and planning for scalability and growth.
  • Strong working experience with Data Lakehouse architecture.
  • In-depth knowledge of the SSIS ETL tool and good working knowledge of Power BI.
  • Experience working with data sources such as SAP and Salesforce.
  • Knowledge of SSIS (ETL tool), Azure Cloud, ADF, Azure Synapse Analytics, and Azure Event Hubs.
  • Experience working daily with product managers, project managers, business users, application development team members, DBA teams, and the Data Governance team to analyze requirements and to design, develop, and deploy technical solutions.

Requirements: Minimum bachelor’s degree in computer science or a related field, with at least 18 years of experience. Send resume to: hr@bermtec.com. Travel and relocation to various unanticipated client locations throughout the United States may be required. Equal Opportunity Employer.

