Software Engineer – Big Data Platform

Software Engineer – Big Data Platform Job Description Template

Our company is looking for a Software Engineer – Big Data Platform to join our team.

Responsibilities:

  • Contribute to detailed component designs and development plans;
  • Oversee the collection and analysis of key metrics and reporting dashboards to monitor enterprise data platform performance and reliability;
  • Support the development and maintenance of data engineering guidelines, policies, standards and process narratives for in-scope business functions;
  • Participate in and contribute to technology stack research and assessment as required;
  • Contribute to the design, development and delivery of the team’s Data Enablement Program and Data Intelligence Program roadmap;
  • Implement approved designs following industry best practices and to a high quality standard;
  • Interact with the project owner, data engineering team lead, and related internal customers for feature development and troubleshooting.

Requirements:

  • Solid understanding of and hands-on experience with relational databases (MySQL, PostgreSQL, SQL Server, etc.);
  • Bachelor’s degree in Computer Science, Software Engineering, Computer Engineering, or other related degree;
  • Experience with Apache Kafka, Apache Airflow, Google Cloud Platform and Google BigQuery;
  • Excellent knowledge of advanced SQL for working with large data sets;
  • Experience developing on and operating virtualization, container, and cloud platforms such as Kubernetes, Docker Swarm, Docker, etc.;
  • Practical experience in core Java, C#, or Python, with a solid understanding of the Software Development Life Cycle;
  • Familiarity with common design patterns, and strong component design and modeling skills;
  • Familiarity with mainstream big data technologies such as distributed computing frameworks;
  • Familiarity with ETL technologies and tools for large-scale data.