Hadoop Big Data Administrator Positions with IRIS Software Noida Full Time

Iris Software Inc

7-12yrs

40 days ago

Noida

Primary Skills : Cloudera, Administration, Python, Big Data Development, Support, Scala, Hadoop, Unix Shell Scripting, MapReduce.

Secondary Skills : Hortonworks, Spark, Linux, Big Data, Hive, Pig, Unix.

No. of Positions : 4

Salary Range : 8-14 lac

Job Description

We are looking for Big Data/Hadoop Administrator resources; the required job description is below.

 

Primary Skills:

 

  • Spark, Hive, Oozie, Sqoop, Hadoop
  • ETL, MapReduce
  • SQL queries and experience with any RDBMS will be an add-on.
  • Expertise in setting up fully distributed multi-node Hadoop clusters with Apache and Cloudera distributions.
  • Administration of Hadoop clusters, handling performance-management requests based on the available sample dataset, and capacity planning of the cluster from the available data set.
  • Strong Linux administration skills to tune the nodes according to the behaviour of the application jobs run by end users.
  • Good knowledge of installing, configuring, and using ecosystem components such as Hadoop, MapReduce, Oozie, Hive, Sqoop, Pig, Flume, ZooKeeper, and Kafka, plus NameNode recovery and HDFS High Availability, using Cloudera Manager and Ambari.
  • Extensive experience performing administration, configuration management, monitoring, and debugging in Hadoop clusters.
  • Good knowledge of importing/exporting structured and unstructured data between HDFS and sources such as RDBMSs, event logs, and message queues, using tools such as Sqoop and Flume.
  • Hands-on experience resolving complex technical issues, such as recovery of nodes and maintenance of Hadoop configuration files across cluster nodes.
  • High availability, backup-and-recovery (BAR), and disaster-recovery (DR) strategies and principles for a Cloudera BDR cluster.
  • Experience setting up and configuring multi-node Hadoop clusters on various Linux platforms.
  • Ability to integrate different Hadoop distributions such as CDH, Hortonworks, and Apache Hadoop.
  • Experience with CDH components such as HDFS, Sqoop, Sqoop2, Pig, Hive, ZooKeeper, HBase, Oozie, Impala, and Hue.
  • Experience with MapReduce (MRv1), YARN (MRv2), and Spark.
  • Experience with cluster maintenance tasks such as adding, removing, and rebalancing nodes using cluster management tools like Cloudera Manager and Apache Hadoop.
  • Configuring High Availability (HA) using Cloudera Manager, including HA for other CDH components.
  • Troubleshooting, backup, disaster recovery, performance tuning, upgrades, and resolving Hadoop issues.
  • Cluster monitoring using events and alerts, and configuring backups using Cloudera Manager.
  • Daily ticket analysis of open and critical operations issues.
  • Strong understanding of the Hadoop schedulers: FIFO, Fair, Capacity, and DRF.
  • Experience in cluster migration from Cloudera to Hortonworks or vice versa.
  • Shell scripting to support infrastructure activities.
  • Experience with tools like Fabric to deploy infrastructure configuration files across the cluster.
  • Upgrading clusters using Cloudera parcels is a must; upgrading Cloudera through repos is optional.
  • Timings of the Job : 12pm-9pm
  • Qualifications & Certifications: Bachelor's degree or higher is strongly preferred.

 

 
If you are interested, please share your resume at aastha.kapoor@irissoftware.com with the following details:
  • Total Exp :
  • Total Exp in Hadoop Infra :
  • Total Exp in Linux :
  • Total Exp in Python and Unix :
  • CTC :
  • ECTC :
  • NP :
  • Location :
