Requirements
- Kubeflow, MLflow, Prometheus, Grafana
- Seldon Core, Spark, Kafka
- PyTorch, TensorFlow, CUDA
- Computer Vision, Generative AI, LLMs
Your Role And Responsibilities
- Develop and deploy machine learning models to enhance the operational efficiency and resilience of payments platforms
- Implement predictive analytics and anomaly detection to identify and resolve issues proactively across the payments value chain (a rough illustration follows this list)
- Build AI-driven solutions to optimize transaction routing, clearing, and settlement processes
- Integrate machine learning models with observability tools like Splunk, Instana, or AppDynamics for enhanced monitoring and insight generation
- Develop risk assessment algorithms to enhance the security and compliance of the payments system
- Work with payments SRE teams to ensure seamless deployment and scaling of AIOps models
- Align ML solutions with industry standards and regulatory compliance frameworks
- Create data pipelines for processing real-time transactional and operational data from payments systems
- Integrate AI solutions with payment orchestration and clearinghouse systems to improve end-to-end efficiency
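As a rough, non-binding illustration of the anomaly-detection work described above, here is a minimal sketch using scikit-learn's IsolationForest on synthetic transaction telemetry. The feature set (latency, amount), data volumes, and contamination rate are illustrative assumptions, not part of this posting.

```python
# Minimal sketch: unsupervised anomaly detection on payment-transaction telemetry.
# Feature names and thresholds are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=0)

# Synthetic "normal" transactions: authorization latency (ms) and amount.
normal = np.column_stack([
    rng.normal(120, 15, size=5000),
    rng.lognormal(3.5, 0.6, size=5000),
])

# A handful of degraded transactions with much higher latency.
degraded = np.column_stack([
    rng.normal(900, 50, size=25),
    rng.lognormal(3.5, 0.6, size=25),
])

X = np.vstack([normal, degraded])

# contamination is the assumed anomaly rate in the training window.
model = IsolationForest(contamination=0.01, random_state=0).fit(X)

# predict() returns -1 for anomalies and 1 for inliers.
labels = model.predict(X)
print(f"flagged {(labels == -1).sum()} of {len(X)} transactions as anomalous")
```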
Required Technical And Professional Expertise
- Strong expertise in machine learning frameworks (e.g., PyTorch, TensorFlow) and AIOps platforms
- Proficiency in programming languages (Python, R, or Java) and data processing tools such as Spark and Kafka
- Hands-on experience with cloud-native platforms and containerization technologies
- Experience with logging, monitoring, and observability tools such as Splunk, Instana, AppDynamics, Prometheus, and Grafana (see the sketch after this list)
- Proven experience in deploying ML models for anomaly detection, event correlation, and predictive analytics
- Hands-on knowledge of LLM-based systems and LLM observability/security
- Experience with event correlation and supervised learning techniques
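A minimal sketch of how a model-derived score might be surfaced to the Prometheus/Grafana stack listed above, using the prometheus_client library. The metric name, port, scrape interval, and scoring placeholder are illustrative assumptions.

```python
# Minimal sketch: expose a model-derived health score for Prometheus scraping.
import random
import time

from prometheus_client import Gauge, start_http_server

ANOMALY_SCORE = Gauge(
    "payments_anomaly_score",
    "Latest anomaly score from the AIOps model (higher = more anomalous)",
)

def score_latest_window() -> float:
    # Placeholder for real model inference over the latest telemetry window.
    return random.random()

if __name__ == "__main__":
    start_http_server(8000)   # metrics served at :8000/metrics
    while True:
        ANOMALY_SCORE.set(score_latest_window())
        time.sleep(15)        # align with the Prometheus scrape interval
```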
Preferred Technical And Professional Experience
- Proficiency in Python, R, or Java for AI/ML model development and testing
- Strong data modeling skills, including the ability to design and implement normalized and denormalized schemas
- Proficiency in encryption technologies and secure data handling, including experience with encryption protocols, data masking (a small illustration follows this list), and access control mechanisms
- Experience with data integration tools and frameworks for real-time and batch data processing
- Knowledge of cloud platforms and cloud-native database services, including serverless computing, data lakes, and containerized data workloads
- Strong troubleshooting skills to identify and resolve performance, integration, and data quality issues in complex data ecosystems
- Ability to work effectively with cross-functional teams, ensuring that data solutions meet business and technical requirements
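A minimal sketch of the data-masking idea mentioned above: hiding all but the last four digits of a card number while keeping a salted hash so masked records can still be joined. The salt handling is illustrative only; a production system would use a managed secret or a tokenization service.

```python
# Minimal sketch: mask a PAN for display and derive a stable join token.
import hashlib

SALT = b"example-salt-do-not-use-in-production"  # illustrative placeholder

def mask_pan(pan: str) -> str:
    """Replace all but the last four digits with asterisks."""
    digits = "".join(ch for ch in pan if ch.isdigit())
    return "*" * (len(digits) - 4) + digits[-4:]

def pan_token(pan: str) -> str:
    """Non-reversible, salted token so masked records can still be correlated."""
    return hashlib.sha256(SALT + pan.encode()).hexdigest()[:16]

print(mask_pan("4111 1111 1111 1111"))   # ************1111
print(pan_token("4111 1111 1111 1111"))
```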