
Applications Developer-Big Data

Req #: 170088563
Location: Bangalore East, KA, IN
Job Category: Technology
Job Description:

J.P. Morgan is a leading global financial services firm, established over 200 years ago:

•We are the leader in investment banking, financial services for consumers and small businesses, commercial banking, financial transaction processing, and asset management.

•We have assets of $2.5 trillion and operations worldwide.

•We operate in more than 100 markets.

•We have more than 243,000 employees globally.

Our wholesale businesses include J.P. Morgan’s Asset Management, Commercial Banking and the Corporate & Investment Bank, which provide products and services to corporations, governments, municipalities, non-profits, institutions, financial intermediaries and high-net-worth individuals and families.

Our corporate functions support the entire organization and include the following functions: Accounting, Audit, Finance, Human Resources, Operations, and Technology.

J.P. Morgan in India provides a comprehensive range of Corporate & Investment Banking, Commercial Banking, Asset & Wealth Management, and Corporate Functions services and solutions to our clients, executing some of the most important financial transactions and providing essential strategic advice to clients such as governments, large domestic and multinational corporations, non-governmental organizations, financial institutions and investors. India is a key market for JPMorgan Chase globally, and our employees in India are a critical part of how we do business worldwide and are integrated within our businesses. Our Global Service Centers (GSCs) are strategically positioned in Mumbai, Bangalore and Hyderabad to support the firm’s operations regionally and globally. The centers provide comprehensive strategic support across technology and business operations processing to all lines of business and the corporate functions.

The Technology team at our GSCs services all Lines of Business and Enterprise Technology, helping build and operate innovative, industry-leading solutions. The breadth of capabilities within the Technology team at the GSCs enables it to support the firm in leading-edge areas such as Digital, Big Data analytics, Robotics & Machine Learning.

 

As a Big Data AWS Developer, you will plan, design, analyze, develop, code, test, debug and document programs to satisfy business requirements for large, complex projects. The candidate should be able to work effectively under pressure and meet tight deadlines.

 

Key Responsibilities:

•Design data architectures.

•Design and develop databases, data warehouses and multidimensional data models.

•Define big data solutions that deliver value to the customer; understand customer use cases and workflows and translate them into engineering deliverables.

•Design and build scalable infrastructure and platforms to collect and process very large amounts of structured and unstructured data, including real-time streaming data.

•Lead and actively contribute to key, architecture-heavy data projects.

•Analyze and evaluate applicable emerging technologies for adoption.

Qualifications:

•Proven experience driving technology and architectural execution for enterprise-grade solutions based on Big Data platforms.

•Knowledge of standard methodologies, concepts, best practices, and procedures within a Big Data environment.

•Experience designing data solutions based on the Hadoop ecosystem.

•Familiarity with data visualization tools.

•Familiarity with Hortonworks, Cloudera, or MapR.

•Hands-on exposure to infrastructure-as-a-service providers such as Google Compute Engine, Microsoft Azure or Amazon AWS.

 

Experience with:

•HBase, Hive, Impala, Presto.

•Apache Kafka, Spark, Storm, Flume.

•Kerberos, Transparent Encryption Zones and network security.

•Real-time stream processing.

•Java and its concurrency packages.

•Strong knowledge of RESTful web services.

•Strong scripting skills in Python / shell scripting.
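As an illustration of how this stack typically fits together, below is a minimal sketch of a real-time pipeline that reads events from Apache Kafka with Spark Structured Streaming (PySpark) and prints windowed counts to the console. The broker address ("localhost:9092") and topic name ("events") are hypothetical placeholders; a production job would usually write to a durable sink such as HDFS, Hive or HBase rather than the console.

# Minimal sketch: consume a Kafka topic with Spark Structured Streaming and
# print per-key message counts over 1-minute windows. Requires the
# spark-sql-kafka connector package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

# Subscribe to a placeholder Kafka topic on a placeholder local broker.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "events")
          .load())

# Count messages per key in 1-minute event-time windows.
counts = (events
          .select(col("key").cast("string").alias("key"), col("timestamp"))
          .groupBy(window(col("timestamp"), "1 minute"), col("key"))
          .count())

# Stream the running counts to stdout; a real job would write to a durable sink.
query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())

query.awaitTermination()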

 

 

