Job Type: Full Time
JPMorgan has undertaken an aggressive digital transformation strategy, and is investing in innovative ways to attract clients, deepen engagement and drive increased satisfaction through delightful interactions with digital products and experiences. We strive to transform client experiences, simplify the ways we do business and tirelessly drive toward product excellence. Our team is at the heart of driving this transformation, focused on developing innovative offerings that put the client at the center.
Culture is important to us, and we are looking for intellectually curious, honest, passionate, hungry individuals who would like to expand their skills while working on an exciting new venture for the firm. Your work will have a massive impact, both on us as a company and on our clients and business partners around the world.
We are seeking experienced Software Engineers to take part in an exciting initiative to help build a new Data and Analytics platform from the ground up, participating in the design, development and execution of solutions end to end. To meet the challenge, you will have the opportunity to work with Core Java, stream processing and big data technologies.
Work will include all aspects of the software development lifecycle, including exposure to middleware messaging technologies and multi-tiered environments. You will also support and participate in system and integration testing across sub-systems.
Specific responsibilities will include, but are not limited to:
Provide expertise and hands-on experience building custom connectors using Kafka core concepts and APIs
Design, implement and support custom data-processing solutions (real-time and batch) using open-source components
Communicate with project stakeholders (gathering requirements, giving regular updates, etc.)
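The real-time and batch solutions mentioned above share one core idea; as a hedged, dependency-free sketch (the class and method names here are illustrative, not the platform's actual stack), the same aggregation can be written as a batch job over a complete data set or as a streaming job that updates state per record:

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.function.Consumer;

// Illustrative sketch: one word-count aggregation in two processing models.
public class ProcessingModels {

    // Batch: the whole input is available up front and processed at once.
    static Map<String, Long> batchWordCount(List<String> records) {
        Map<String, Long> counts = new TreeMap<>();
        for (String r : records) {
            counts.merge(r, 1L, Long::sum);
        }
        return counts;
    }

    // Streaming: state is updated one record at a time; a real engine
    // (Kafka Streams, Flink, Beam) layers partitioning, fault tolerance
    // and windowing on top of this core idea.
    static class StreamingWordCount implements Consumer<String> {
        final Map<String, Long> state = new TreeMap<>();
        public void accept(String record) {
            state.merge(record, 1L, Long::sum);
        }
    }

    public static void main(String[] args) {
        List<String> input = List.of("a", "b", "a");

        Map<String, Long> batch = batchWordCount(input);

        StreamingWordCount streaming = new StreamingWordCount();
        input.forEach(streaming);

        // Both models converge to the same result on the same finite input.
        System.out.println(batch.equals(streaming.state)); // true
        System.out.println(batch); // {a=2, b=1}
    }
}
```

The streaming variant is the one that maps naturally onto Kafka consumers, where records arrive continuously rather than as a closed set.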
Required skills and experience include:
Strong experience with server-side programming, regardless of language.
Experience working with open-source frameworks.
Experience with scripting and working in a UNIX environment.
Working experience with Kafka from both development and operations perspectives: creating topics, setting up cluster redundancy, deploying monitoring tools and alerts, and applying best practices.
Experience with Apache Spark, Apache Flink or Apache Beam
Ensure optimum performance, high availability and stability of solutions
Experience with automation tools such as Maven and Jenkins, and provisioning using Docker
Ability to perform data-related benchmarking, performance analysis and tuning.
Strong skills in in-memory applications, database design and data integration.
Experience with microservices and CQRS (Command Query Responsibility Segregation)
Knowledge of NoSQL databases and Kafka messaging systems.
Strong Java background, with Spring and Spring Boot
Experience with Python
Experience with Kubernetes
Experience with AWS, Azure or any cloud provider
Excellent communication skills in English (both written and spoken forms).
Comfortable in more than one programming language, with a firm grasp of fundamental web and internet technologies
A self-starter mindset
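CQRS, listed among the qualifications above, simply separates the write path (commands producing events) from the read path (a denormalized query model). A minimal, dependency-free Java sketch, with hypothetical names (AccountService-style classes, not a real API):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical CQRS sketch: writes go through a command handler that appends
// events; reads are served from a separately maintained read model.
public class CqrsSketch {

    record Deposit(String accountId, long amountCents) {}

    // Write side: validates commands and records events.
    static class CommandHandler {
        final List<Deposit> eventLog = new ArrayList<>();
        final ReadModel readModel;

        CommandHandler(ReadModel readModel) { this.readModel = readModel; }

        void handle(Deposit cmd) {
            if (cmd.amountCents() <= 0) {
                throw new IllegalArgumentException("amount must be positive");
            }
            eventLog.add(cmd);
            readModel.apply(cmd); // in production this propagation is usually asynchronous
        }
    }

    // Read side: a denormalized view optimized for queries.
    static class ReadModel {
        final Map<String, Long> balances = new HashMap<>();
        void apply(Deposit e) { balances.merge(e.accountId(), e.amountCents(), Long::sum); }
        long balance(String accountId) { return balances.getOrDefault(accountId, 0L); }
    }

    public static void main(String[] args) {
        ReadModel readModel = new ReadModel();
        CommandHandler commands = new CommandHandler(readModel);

        commands.handle(new Deposit("acct-1", 500));
        commands.handle(new Deposit("acct-1", 250));

        System.out.println(readModel.balance("acct-1")); // 750
    }
}
```

The appeal in a data platform context is that the event log can be replayed to rebuild or reshape read models independently of the write path.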