Senior Software Engineer
Senior Software Engineer - Kafka / Java
A.P. Moller - Maersk is an integrated container logistics company and member of the A.P. Moller Group, connecting and simplifying trade to help our customers grow and thrive. With a dedicated team of over 80,000 operating in 130 countries, we go all the way to enable global trade for a growing world. From the farm to your refrigerator, or the factory to your wardrobe, A.P. Moller - Maersk is developing solutions that meet customer needs from one end of the supply chain to the other.
The Maersk group operates one of the most comprehensive networks of marine container terminals in the world under the brand of APM Terminals, with over 20,000 employees operating 76 terminals in 36 countries and growing.
APM Terminals (APMT) plays a critical role in enabling the global Maersk Ocean and Logistics network as well as serving other Shipping Lines, Beneficial Cargo Owners, Freight Forwarders, intermodal logistics providers and integrating with a wide variety of local port authorities, customs and government bodies across our global network of transshipment, import and export gateway terminals.
In support of the broader Maersk strategy, we are accelerating the transformation of our business from a portfolio of independently operated terminals into a safer, better, bigger global terminals operator.
We are currently looking for a Senior Software Engineer to join our APM Terminals Integration Platform Portfolio team, based in Bangalore (India). Key responsibilities include:
• Building software in accordance with the Maersk standards and guidelines
• Responsible for the quality design and implementation (supportable, maintainable, scalable, performant, secure and efficient) of data-driven applications delivering business value
• Provide design validation on key technologies
• Building Data Products in the form of APIs and derived events in a reusable way
• Using disparate data sources to deliver complex event-driven products, building a good understanding of the business by partnering with stakeholders
• Designing and engineering Data Products that are lean and end-to-end, with a value-oriented mindset, considering FinOps, Lean and true MVPs when delivering products
• Supporting the onboarding of data from multiple terminals with different cultures and differing levels of technical debt
• Taking part in on-call rotations with the platform consumers (product development teams) and taking the lead in preventing incidents and maintaining platform SLAs through automation and blameless postmortems
• Ensuring builds are kept green and the code management strategy (branching) is closely followed.
• Raising capability and standards within the team: pairing on tasks, peer-reviewing team members’ code and giving constructive, blame-free feedback to improve both the code base and team capability
• Contributing proactively to continual improvement within your team, through active participation in retrospectives and engagement with cross-team best-practice communities
• Advising Product Owners to identify and manage risks, technical debt, issues and opportunities for technical improvement
• Supporting the recruitment of engineers across the department
• Providing technical support during cut-over activities
Who we are looking for:
We are looking for candidates with a proven performance track record with the following:
- At least 4 years of relevant experience with Kafka.
- At least 6 years of experience in Java (preferably Java 11)
- Strong API development experience, from design through to implementation
- Working with GitHub Actions to deliver automated CI/CD pipelines.
- Best practices for code repositories and the development lifecycle, using tools such as GitHub
- You have an innovative, can-do attitude; you explore the best of what is out there – from CNCF projects to Open Source – with a focus on accelerating delivery to your customers
- You are practiced in Agile, TDD, BDD methodologies and can work within and advise scrum teams
- A secure-by-design attitude – from TLS certificates and OAuth to industry CIS standards – you know how to engineer security in from the start, and then stay secure
- You can advise other engineers on best practices within a cloud-native world
- Using: Kafka, RESTful APIs, PostgreSQL, ArgoCD, Prometheus, Grafana, Ceph, Thanos – you thrive with the technologies of a cloud-native world
- Working with open-source streaming technologies such as Kafka and Debezium.
- Using enterprise-scale patterns: 12-factor applications, CQRS, eventual consistency, multi-region etc.
- Developing and implementing solutions using Kafka.
- Administering and improving the use of Kafka across the organization, including Kafka Connect, Kafka Streams and custom implementations.
- Working with multiple teams to ensure the best use of Kafka and data-safe event streaming.
- Understanding and applying event-driven architecture patterns and Kafka best practices, and enabling other development team members to do the same.
- Strong knowledge of and experience with the Kafka Streams API, Kafka Connect, Kafka brokers, ZooKeeper, API frameworks, pub/sub patterns, Schema Registry, KSQL, REST Proxy, Replicator, ADB, Operator and Control Center.
- Hands-on experience with Kafka connectors such as MQ connectors, Elasticsearch connectors, JDBC connectors and the FileStream connector.
- Expert-level understanding of data migration and CDC as they relate to Kafka, using Kafka Connect and Debezium (preferred)
- Knowledge of source and sink connector technical details for a variety of platforms, including PostgreSQL, MS SQL Server, Oracle and others as required.
- Understanding key Kafka metrics, reading dashboards and providing support to ensure no downtime.
- Experience with microservice application architecture and developing Java Spring Boot container applications
- Creating topics, setting up redundant clusters, deploying monitoring tools and alerts, and good knowledge of best practices. Experience in building Kafka producer and consumer applications using Spring Boot
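To illustrate the kind of CDC work described above, a minimal Debezium PostgreSQL source connector configuration (submitted to the Kafka Connect REST API) might look like the sketch below. The connector name, host, database and table names are hypothetical placeholders, not part of this role's actual setup:

```json
{
  "name": "terminal-pg-source",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "database.hostname": "postgres.example.internal",
    "database.port": "5432",
    "database.user": "debezium",
    "database.password": "********",
    "database.dbname": "terminal_ops",
    "topic.prefix": "terminal",
    "table.include.list": "public.container_moves",
    "plugin.name": "pgoutput"
  }
}
```

With a configuration like this, Debezium streams row-level changes from the included table into Kafka topics prefixed with `terminal`, which downstream consumers can then process as derived events.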
Must Have Skills
- Adhering to best SecDevOps practices
- Working in a manner that drives cost efficiency (FinOps) through the removal of waste.
- Driving reusability through well-designed data products
- Listening to the needs of technical and business stakeholders and interpreting them in partnership with the product owner
- Identifying areas of innovation in data tools and techniques, and recognizing the appropriate time for adoption.
- Understanding, and helping teams apply, a range of data-profiling techniques; analyzing data from a complex single source and bringing multiple data sources together in a conformed model for analysis.
- Working with open-source technology and introducing new technology to solve problems.
- Keeping up to date with the latest technology and industry trends.
- A real passion for data and technology
- Extensive experience with database technologies and architecture.
- Exceptional problem-solving and critical thinking skills.
- Experience working with virtual teams and scrum teams
- The ability and initiative to see ambiguity as an opportunity, and to solve problems, innovate and create.
- Demonstrating a customer-first mentality (quick to market, understanding business outcomes) and striving to improve the product for the customer.