Big Data Administrator - (Prague/Brno/Ostrava/Remote, Python, Spark, Hadoop)
D-ploy (Czech Republic) s.r.o., Czech Republic

Experience: 1 Year
Traveling: No
Qualification: As mentioned in job details
Total Vacancies: 1
Posted on: Aug 31, 2021
Last Date: Sep 30, 2021
Location(s): Prague / Brno / Ostrava / Remote

Job Description

D-ploy is an IT and Engineering Solutions company with operations throughout the EMEA region, including Switzerland, Germany, the Czech Republic, Austria, and the UK, as well as the USA.

We pride ourselves on delivering innovative and superior services and solutions to numerous industry-leading clients. By building relationships and trusted partnerships within the IT community, we optimize our customers' IT productivity and contribute to their organizations' success and value.

We are interested in talking to engaging, flexible, and solution-oriented individuals who are looking to become part of a dynamically growing, international organization. We are focused on creating value where IT counts. Join us!

Tasks and Responsibilities

  • Responsible for the full lifecycle of the big data product
  • Define architecture patterns
  • Develop new components
  • Analyze, design, develop, and deploy solutions
  • Main technologies used: Python, Spark, Hadoop, Hive, Impala, and Kafka (see the short sketch after this list)
  • Optional: familiarity with Scala, Bash, Java, the JVM, and SQL
  • Hadoop cluster administration, performance tuning, and troubleshooting
  • Extend cluster functionality by adding new components, tuning performance, managing user access rights, and handling other tasks as needed
  • Work closely with other IT and application teams to ensure that all big data applications are highly available and performing as expected
  • Responsible for capacity planning and for estimating requirements to scale the Hadoop cluster up or down
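
For illustration only: a minimal PySpark sketch of the kind of job this stack implies, reading a Hive table, aggregating it, and publishing the result to Kafka. The table name, Kafka topic, and broker address are hypothetical placeholders, not part of the role description.

    from pyspark.sql import SparkSession

    # Hive support lets Spark read tables managed by Hive/Impala on the cluster.
    spark = (
        SparkSession.builder
        .appName("daily-event-aggregation")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Hypothetical source table; adjust to the actual environment.
    events = spark.table("analytics.web_events")

    # Aggregate events per day and type.
    daily_counts = events.groupBy("event_date", "event_type").count()

    # Persist the aggregate back to Hive.
    daily_counts.write.mode("overwrite").saveAsTable("analytics.daily_event_counts")

    # Publish the same records to Kafka (requires the spark-sql-kafka connector on the
    # classpath; broker address and topic are placeholders).
    (
        daily_counts
        .selectExpr("CAST(event_type AS STRING) AS key", "to_json(struct(*)) AS value")
        .write
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker1:9092")
        .option("topic", "daily-event-counts")
        .save()
    )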

Requirements

  • Experience administering Hadoop clusters based on the Cloudera distribution (ideally 5.x or 6.x)
  • Main technologies used: Python, Spark, Hadoop, Hive, Impala, and Kafka
  • Optional: familiarity with Scala, Bash, Java, the JVM, and SQL
  • Knowledge of cluster monitoring tools
  • Good understanding of NoSQL databases such as HBase, and knowledge of Git/GitHub
  • At least basic knowledge of networking, CPU, memory, and storage
  • Basic shell-scripting skills (a small example follows this list)
  • Fluency in English (spoken and written)
  • Candidates must provide a criminal record extract no older than three months
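
As a sketch of the routine administration checks implied by these requirements (an assumption about how they might be scripted, not a prescribed tool), the following Python snippet parses the output of `hdfs dfsadmin -report` to compare configured and used HDFS capacity. It assumes the `hdfs` CLI is on the PATH and that the caller is allowed to run dfsadmin.

    import re
    import subprocess

    def hdfs_capacity_summary():
        """Parse `hdfs dfsadmin -report` and return (configured, used, ratio) in bytes."""
        report = subprocess.run(
            ["hdfs", "dfsadmin", "-report"],
            capture_output=True, text=True, check=True,
        ).stdout

        # The summary block of the report contains lines such as:
        #   Configured Capacity: 123456789012 (114.98 GB)
        #   DFS Used: 2345678901 (2.18 GB)
        configured = int(re.search(r"Configured Capacity:\s+(\d+)", report).group(1))
        used = int(re.search(r"DFS Used:\s+(\d+)", report).group(1))
        return configured, used, used / configured

    if __name__ == "__main__":
        total, used, ratio = hdfs_capacity_summary()
        print(f"HDFS usage: {used} of {total} bytes ({ratio:.1%})")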

Benefits

  • Broad range of activities, tasks, and projects
  • Flexible working conditions
  • Minimum 5 weeks of vacation
  • Paid sick days
  • Meal vouchers
  • Vouchers (birthday, wedding, and newborn surprise)
  • Contributions to wellness programs (Multisport card)
  • Fishing for Friends, our referral program
  • Refreshments in the D-ploy office
  • Further development and professional advancement
  • Friendly and international working environment
  • Company-sponsored events
  • Competitive salary and various benefits

Is IT in your DNA?
