Company: Baker Hughes
Skills: IT - Programming & Database
Experience: 5 + Years
Education: Bachelors/3-5 yr Degree
Location: Bengaluru, Karnataka, India


Job Requirements

We are looking for a motivated and experienced data engineer to join a growing data science team at Baker Hughes. As a senior data engineer, you will work closely with customers, data scientists, and solution architects to build robust, production-ready data infrastructure and data pipelines that help scale machine learning and analytics solutions. In this role, you will drive the development of data engineering solutions from initial experimentation through to production deployment. You will also work with data science leadership to develop internal tools for rapid ingestion and integration of customer data into selected cloud platforms and other SaaS solutions.

Responsibilities

· Collaborate within a global data science team to develop scalable and robust data integration infrastructure.

· Engage directly with customers and partners to define and develop data requirements based on functional needs.

· Build custom data integration pipelines from existing source systems into cloud platforms such as AWS and Microsoft Azure.

· Enable data ingestion, pre-processing, and custom data wrangling from filesystems, databases, queues, and streams to support rapid prototyping.

· Work with customers to develop custom data handlers and connectors as needed.

· Perform a variety of data loads and data transformations.

· Improve database and application performance through fine-tuning.

· Work with other project teams on data integration and data lake requirements.

· Automate processes to improve application stability and performance.

Work Experience

Qualifications/Requirements

· BS or MS degree in Computer Science, Engineering, or IT.

· Minimum of 5 years of professional experience in data engineering, databases, and/or business analytics.

· Experience reading and parsing industrial P&ID and PFD documents.

· Solid background and experience in SQL, Python, and/or Java/Scala.

· Minimum of 3 years' experience in ETL and data pipeline design for heterogeneous data such as time series, streams, queues, and text.

· Familiarity with containers and container services such as Docker, Docker Registry, and Kubernetes.

· Knowledge of test-driven development, agile software development methodologies, and related tools.

· Excellent verbal and written communication skills to work in a global team and with customers on a regular basis.