Where you fit in
Shell's Projects and Technology business develops advanced products and technologies to meet customer demand, grow the LNG, Gas, and Power businesses, integrate Manufacturing, Chemicals, and Trading, and maximize business competitiveness. As a Senior Data Engineer, you will act as a subject matter expert, mentoring young professionals and fostering data-driven thinking within the organization. You will translate the Vision and Data Strategy into IT solutions and deliver them, communicating with both technical developers and business managers.
What's the role?
- Actively deliver the roll-out and embedding of Data Foundation initiatives in support of key business programs, advising on technology and using leading market-standard tools.
- Ensure traceability of requirements from data through testing and scope changes to training and transition.
- Drive implementation efficiency and effectiveness across pilots and future projects to minimize cost, increase speed of implementation, and maximize value delivery.
- Coordinate the change management, incident management, and problem management processes.
- Work with source control technologies such as GitHub and Azure DevOps.
- Help create data-driven thinking within the organization, not just within IT teams, but also in the wider Shell stakeholder community.
- Communicate skillfully with technical developers, architects, and business stakeholders.
What we need from you
- At least 7 years of experience in the IT industry
- Degree in a quantitative field such as Computer Science, Statistics, Informatics, or Information Systems
- Experience working with relational databases, data warehouses, SQL, and NoSQL databases
- Familiarity with workflow management tools such as Azkaban, Luigi, and Airflow
- Strong query optimization skills and the ability to manipulate, process, and extract value from large, disconnected datasets
- Experience working with data engineering technologies and AWS cloud services such as EC2, EMR, RDS, and Redshift; experience handling big data sets and big data technologies will be an asset
- In-depth knowledge of at least one data-oriented language or framework such as Python, SQL, or PySpark
- Proficiency with core Python libraries such as NumPy, pandas, and Django
- Understanding of Data Foundation initiatives such as Modelling, Data Quality Management, Data Governance, Data Maturity Assessments, and Data Strategy.
- Experience working with Agile and Kanban methodologies
- Desirable skills include experience in creating or enhancing CI/CD build and release pipelines, and proficiency in tools and languages such as YAML, PowerShell, and Terraform
- Experience with big data tools such as Kafka, Hadoop, and Spark, and with stream-processing systems, as well as AWS certifications, is desirable.