
Hadoop Support Analyst - Intelligent Solutions

Req #: 170086797
Location: New York, NY, US
Job Category: Technology
Job Description:
JPMorgan Chase & Co. (NYSE: JPM) is a leading global financial services firm with assets of $2.6 trillion and operations worldwide. The firm is a leader in investment banking, financial services for consumers and small business, commercial banking, financial transaction processing, and asset management. A component of the Dow Jones Industrial Average, JPMorgan Chase & Co. serves millions of consumers in the United States and many of the world's most prominent corporate, institutional and government clients under its J.P. Morgan and Chase brands. Information about JPMorgan Chase & Co. is available at www.jpmorganchase.com.
JPMorgan Intelligent Solutions (JPMIS) transforms JPMC data assets to create and commercialize information and solutions that enable consumers, businesses and governments to make better decisions and achieve their objectives.
 
As a Hadoop Support Analyst you will support multiple big data initiatives at JPMorgan Intelligent Solutions. This role focuses primarily on the operation of complex Hadoop systems: managing content, maintaining optimal performance, and supporting end users. It requires strong collaboration and communication skills to interact with various IT and business groups.
 
Primary Responsibilities
  • Manage scalable Hadoop cluster environments. 
  • Optimize and tune the Hadoop environments to meet performance requirements.
  • Work with big data developers to design scalable, supportable infrastructure
  • Work with the Linux server admin team to administer the server hardware and operating system
  • Assist with developing and maintaining the system runbooks
  • Coordinate root cause analysis (RCA) efforts to minimize future system issues
  • Mentor, develop and train junior staff members as needed
  • Provide off hours support on a rotational basis

This position is anticipated to require the use of one or more High Security Access (HSA) systems. Users of these systems are subject to enhanced screening which includes both criminal and credit background checks, and/or other enhanced screening at the time of accepting the position and on an annual basis thereafter. The enhanced screening will need to be successfully completed prior to commencing employment or assignment. 

Qualifications
  • BS degree in Computer Science/Engineering required
  • 5+ years of IT experience
  • 3-5 years of overall experience with Linux systems
  • 2 years of experience developing or supporting applications in a Hadoop environment
  • Well versed in managing Hadoop distributions (Hortonworks, Cloudera, etc.)
  • Expert-level knowledge of Spark, HBase, Kafka and services interoperability
  • Advanced experience with Hive, Impala, Sqoop and Flume
  • Knowledge of performance troubleshooting and tuning of Hadoop clusters
  • Good knowledge of Hadoop cluster connectivity and security
  • Familiarity with Hadoop security and permissions schemes
  • Excellent customer service attitude, communication skills (written and verbal), and interpersonal skills
  • Experience working in cross-functional, multi-location teams
  • Excellent analytical and problem-solving skills