• Hadoop Administrator

    Location US-OH-Cincinnati
    Posted Date 6/25/2018 2:09 PM
    Position Type: Full Time
  • Overview

    Be Here. Be Great. Working for a leader in the insurance industry means opportunity for you. Great American Insurance Group’s member companies are subsidiaries of American Financial Group, a Fortune 500 company. We combine a "small company" culture where your ideas will be heard with "big company" expertise to help you succeed. With over 30 specialty property and casualty operations and a variety of financial services, there are always opportunities here to learn and grow.


    • Contribute to the design, implementation, and support of multiple Cloudera Distributions of Apache Hadoop (CDH) in support of full-lifecycle application development, analytics, and data management.
    • Install, configure, monitor, tune, and troubleshoot all components of the CDH environments, including but not limited to, Cloudera Manager, Cloudera Management Services, HDFS, YARN, Zookeeper, Hive, Spark, Hue, Kudu, Impala, HBase, Key Management Server, Kafka, Flume, Solr, SSL, Sqoop, and Sentry.
    • Collaborate with other departments on requirements, design, standards, and architecture of applications.
    • Work directly with external vendors to resolve issues and perform technical tasks.
    • Provide Production Support during and after business hours, as needed.
    • Advise manager of platform risks, issues, and concerns.
    • Project and recommend periodic, incremental capacity growth based on current and forecasted workloads.
    • Contribute to the design, implementation, and support of software and hardware environments and configurations that satisfy non-functional requirements, such as security, accessibility, auditability, compliance, data retention, usability, and performance.
    • Work closely with infrastructure, network, database, business intelligence, and application teams to ensure solutions meet requirements.
    • Provide subject matter expertise on the capabilities and use of cluster components.


    • 3–5 years’ experience administering Hortonworks and Cloudera (CDH 5.9.x and 5.10.x) distributions of Hadoop required.
    • 3+ years’ experience with OS administration, including memory, CPU, and network management, system capacity planning, and troubleshooting.
    • Experience performing backup and recovery using the Cloudera BDR tool.
    • Experience working with RDBMSs such as Oracle and SQL Server desired.
    • Experience with Hadoop monitoring tools such as Ambari and Cloudera Manager.
    • Experience securing Hadoop clusters using Kerberos, LDAP/AD integration, and encryption.
    • Effective oral, written, and interpersonal communication skills.
    • Demonstrated ability to establish priorities and to organize and plan work to meet established timeframes.
    • Proven ability to handle multiple tasks and projects simultaneously.
    • Excellent problem-solving skills, including core Java application troubleshooting.


    Education: Bachelor’s Degree or equivalent experience.
    Field of Study: Computer Science, Information Technology, or a related discipline.

    • Familiarity with Python, Scala, Spark, and R.
    • Experience working with data warehouses and data marts is a plus.
    • UC4 job scheduling knowledge is a plus.
    • Experience with scripting for automation and configuration management.


