Big Data Engineer


Squadex is a fast-growing technology consulting and engineering company headquartered in Silicon Valley, with a large R&D center in Ukraine. We offer DevOps and Data Science expertise to emerging startups and well-established enterprises in the US market.

The Role

We are looking for a seasoned data/algorithm engineer with experience building and maintaining large-scale ETL pipelines and data warehouses, to build the next-generation mission-critical data platform. If you are comfortable swimming in petabytes of data to define quality metrics, explore trends, and write scalable code for data pipelines, come talk to us.

In this role, you will

  • Design, implement, and maintain LeadGenius’s big data pipeline infrastructure, including data ingestion, stream processing, data warehousing, data pipelines, visualization, quality metrics, and exposing data to applications
  • Perform data modeling and metadata management
  • Ensure a scalable, highly available, and robust big data platform architecture that meets service level agreements
  • Own end-to-end data processing, troubleshooting, problem diagnosis, performance benchmarking, load balancing, and code reviews

Your KPIs will include

  • Building large-scale data infrastructure systems using open-source technologies
  • Designing high-performance RESTful web services that expose data from SQL and NoSQL stores
  • Cross-team collaboration and effective communication

Skills & Experience

  • 7+ years of experience with an object-oriented programming language such as Java, and with scripting languages such as Python
  • 5+ years of experience with big data and distributed NoSQL technologies such as Spark, HBase, or MapReduce
  • 3+ years of experience designing and implementing large, scalable distributed systems
  • 3+ years of production experience designing and implementing mission-critical, metrics-driven ETL pipelines

Good to have

  • Experience with Akka-based web services is a plus
  • Strong algorithms and CS fundamentals

Apply For This Job