You will contribute to projects that automate Health, Safety, and Environment (HSE), Environmental, Social, and Governance (ESG), and operational processes using AI, Computer Vision, and IoT sensory data. These solutions help industries like oil and gas transition to clean energy and optimize operations through actionable insights.
Technical Stack:
- Languages & Frameworks: Python, FastAPI
- Data Pipelines: Apache Airflow, ETL/ELT processes, Kafka
- Databases: Amazon Aurora, Amazon RDS for PostgreSQL, Amazon Timestream
- Tools & Platforms: Docker, Bitbucket, AWS EKS
- Infrastructure: Terraform
Responsibilities:
- Design, develop, and maintain scalable data pipelines using Python and Apache Airflow.
- Implement ETL and ELT processes to handle streaming IoT and ML data.
- Manage and optimize relational and non-relational databases, ensuring performance and reliability.
- Develop REST APIs and work with webhooks for real-time data updates.
- Perform data processing, aggregation, and analysis to generate actionable insights.
- Collaborate with cross-functional teams to align data workflows with project requirements.
- Optimize workflows through multithreading and performance tuning.
Requirements:
- 5+ years of experience as a Data Engineer or in a similar role.
- Strong expertise in Python.
- Expertise with ETL schedulers such as Apache Airflow, Luigi, Oozie, AWS Glue, or similar frameworks.
- Solid understanding of data warehousing concepts and hands-on experience with relational databases (e.g., PostgreSQL, Aurora) and columnar databases (e.g., Redshift, ClickHouse).
- Proficient in SQL.
- Hands-on experience with message streaming platforms like Kafka.
- Experience with Terraform or similar infrastructure-as-code tools.
- Deep understanding of ETL/ELT processes and data pipeline orchestration.
- Familiarity with REST APIs and real-time data integration via webhooks.
- Knowledge of multithreading for performance optimization.
- Excellent analytical, problem-solving, and communication skills.
- Demonstrated ability to analyze large data sets to identify gaps and inconsistencies, provide data insights, and advance effective product solutions.
Nice-to-Have Skills:
- Previous experience in startup environments or working on IoT/AI projects.
- Experience in the oil and gas domain.
- A degree in Computer Science, Engineering, or a related field.
- Experience with WebSockets.
- Experience with team management.
What We Offer:
- Professional Growth: Work with cutting-edge technologies and complex challenges.
- Innovative Environment: Be part of projects that drive transformation in industries like energy and safety.
- Global Collaboration: Work with team members from Ukraine, Poland, the USA, and beyond.
- Flexibility: Choose between working remotely or from our offices.
- Supportive Culture: Join a dynamic team that values collaboration, innovation, and continuous improvement.
If you’re a talented Senior Data Engineer looking to make an impact with advanced AI and IoT projects, we’d love to hear from you! Apply now!