Amr Ali Eissa

Sr. Data Engineer

Sr. Software Engineer

Java Developer

Unleash the Power of Kafka Streaming for Real-Time Data

I am a Sr. Big Data Engineer specializing in real-time data streaming and scalable data solutions. Skilled in Apache Kafka, Flink, Spark, and Iceberg. Passionate about transforming complex data into actionable insights, optimizing workflows, and driving business success.
Work Experience
2022-Now
Big Data Engineer
Contractor at eSolutions for Alrajhi Bank
  • Developed and deployed on OpenShift a Java application that exposes an API to receive messages from the session-logging platform, parses and transforms them, and produces them to Kafka. Because real-time delivery was required and transaction volume was high, latency had to stay minimal, so I used asynchronous produce calls and tuned the Kafka producer configuration for maximum throughput (see the producer sketch after this list).
  • Developed and deployed many Kafka connectors (source and sink) with custom single message transformations for various business requirements: parsing JSON and XML, building keys by concatenating multiple fields, casting field types, and removing or masking unwanted fields.
  • Created a Java helper tool that consumes messages from Kafka starting from a given date and applies query filters to retrieve only matching messages; it is exposed through an API call, so business users can query Kafka easily.
  • Developed and deployed to production multiple Python alerting scripts that monitor real-time events from Kafka topics and send reports to management.
  • Used KSQL to create two streams from two topics, join them, and produce an enriched stream.
  • Created a Kafka Streams application that filters a stream, enriches it via an external database call, applies transformations and aggregations, makes a decision, and sinks the result to another Kafka topic (see the Kafka Streams sketch below).
  • Created a Flink application that reads from a Kafka source, applies filters, and writes to a Kafka sink.
  • Created a Flink application that reads from a Kafka source, enriches records from Redis using Async I/O, applies filters, and writes the result to a Kafka sink (see the Flink Async I/O sketch below).
  • Created a Flink application that uses the complex event processing library (Flink CEP).
  • Created a Flink application that aggregates messages from a Kafka source using both window aggregation and stateful aggregation (see the windowed-aggregation sketch below).
  • Created ETL jobs using Apache Spark in both batch and streaming modes.
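
Below is a minimal sketch of the asynchronous, throughput-tuned producer pattern from the first bullet; the topic name, bootstrap servers, and tuning values are illustrative rather than the production configuration.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SessionLogProducer {

    private final KafkaProducer<String, String> producer;

    public SessionLogProducer(String bootstrapServers) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Throughput-oriented settings: let the producer batch records and compress each batch.
        props.put(ProducerConfig.LINGER_MS_CONFIG, 5);
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, 64 * 1024);
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4");
        this.producer = new KafkaProducer<>(props);
    }

    // Asynchronous send: the call returns immediately and the callback runs on the
    // producer's I/O thread, so the API request thread never blocks on the broker.
    public void sendAsync(String topic, String key, String payload) {
        producer.send(new ProducerRecord<>(topic, key, payload), (metadata, exception) -> {
            if (exception != null) {
                // The real application routes failures to a retry / dead-letter path.
                System.err.println("Failed to produce record: " + exception.getMessage());
            }
        });
    }

    public void close() {
        producer.close();
    }
}
```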
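A minimal Kafka Streams sketch of the filter, enrich, aggregate, and decide pipeline described above; the topic names, the filter predicate, the alert threshold, and the in-memory lookup table (standing in for the external database call) are all illustrative.

```java
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class TransactionDecisionApp {

    // Stand-in for the external DB lookup used to enrich each event.
    private static final Map<String, String> CUSTOMER_SEGMENT =
            Map.of("cust-1", "VIP", "cust-2", "RETAIL");

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "transaction-decision-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> transactions = builder.stream("transactions-raw");

        transactions
                // 1. Filter: drop records that are not relevant to the rule.
                .filter((customerId, value) -> value != null && !value.isBlank())
                // 2. Enrich: attach a customer attribute (a DB call in the real job).
                .mapValues((customerId, value) ->
                        value + "|" + CUSTOMER_SEGMENT.getOrDefault(customerId, "UNKNOWN"))
                // 3. Aggregate: count events per customer.
                .groupByKey()
                .count()
                .toStream()
                // 4. Decide: flag customers that exceed a threshold.
                .mapValues(count -> count > 10 ? "ALERT" : "OK")
                // 5. Sink the decision to another Kafka topic.
                .to("transaction-decisions");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```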
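A minimal Flink sketch of the Kafka source, Async I/O enrichment, filter, and Kafka sink job described above; topic names are illustrative, and the in-memory map stands in for the non-blocking Redis client used in the real job.

```java
import java.util.Collections;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
import org.apache.flink.connector.kafka.sink.KafkaSink;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.AsyncDataStream;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.async.ResultFuture;
import org.apache.flink.streaming.api.functions.async.RichAsyncFunction;

public class KafkaEnrichmentJob {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("events-raw")
                .setGroupId("enrichment-job")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        KafkaSink<String> sink = KafkaSink.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setRecordSerializer(KafkaRecordSerializationSchema.builder()
                        .setTopic("events-enriched")
                        .setValueSerializationSchema(new SimpleStringSchema())
                        .build())
                .build();

        DataStream<String> events =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-source");

        // Async I/O keeps the operator from blocking while a lookup is in flight.
        DataStream<String> enriched = AsyncDataStream.unorderedWait(
                events, new LookupEnrichment(), 500, TimeUnit.MILLISECONDS, 100);

        enriched
                .filter(value -> !value.endsWith("|UNKNOWN"))
                .sinkTo(sink);

        env.execute("kafka-async-enrichment");
    }

    // Stand-in for an asynchronous Redis lookup.
    public static class LookupEnrichment extends RichAsyncFunction<String, String> {
        private static final Map<String, String> CACHE = Map.of("device-1", "TRUSTED");

        @Override
        public void asyncInvoke(String input, ResultFuture<String> resultFuture) {
            CompletableFuture
                    .supplyAsync(() -> input + "|" + CACHE.getOrDefault(input, "UNKNOWN"))
                    .thenAccept(value -> resultFuture.complete(Collections.singleton(value)));
        }
    }
}
```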
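And a minimal sketch of the windowed aggregation mentioned above, counting events per key in one-minute tumbling windows; the inline test data, key extractor, and window size are illustrative, and in the real job the input comes from a Kafka source as in the previous sketch.

```java
import org.apache.flink.api.common.functions.AggregateFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

public class WindowedCountJob {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Illustrative in-memory input; the production job reads from Kafka.
        DataStream<String> events = env.fromElements("cust-1", "cust-1", "cust-2");

        DataStream<Long> countsPerMinute = events
                .keyBy(value -> value)                                      // key by customer id
                .window(TumblingProcessingTimeWindows.of(Time.minutes(1))) // one-minute windows
                .aggregate(new CountAggregate());                          // incremental stateful count

        countsPerMinute.print();
        env.execute("windowed-count");
    }

    // Incremental aggregation: the accumulator is kept in keyed state per window.
    public static class CountAggregate implements AggregateFunction<String, Long, Long> {
        @Override public Long createAccumulator() { return 0L; }
        @Override public Long add(String value, Long acc) { return acc + 1; }
        @Override public Long getResult(Long acc) { return acc; }
        @Override public Long merge(Long a, Long b) { return a + b; }
    }
}
```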
2020-2022
Full-stack developer
Sigma EMEA
  • Implemented solutions on both the backend (Spring, Spring Boot, …) and the frontend (Angular, Vue) for several multinational companies in the region, including Huawei, WE, Ethio Telecom, Mobilis, and Malitel.
  • Used functional programming heavily to work with lists and maps and apply business logic (see the sketch after this list).
  • Consumed messages from a Kafka topic and sent them as notifications to customers of the My WE apps.
  • Completely revamped the frontend web app, migrating it from Angular 4 to Angular 8.
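
A small illustrative example of the functional style mentioned above, grouping and transforming a list with the Java Stream API; the domain record and field names are made up for the sketch.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class OrderSummary {

    // Illustrative record; the real domain objects differed per project.
    record Order(String customerId, double amount) {}

    // Group orders by customer and sum the amounts, using streams instead of manual loops.
    static Map<String, Double> totalPerCustomer(List<Order> orders) {
        return orders.stream()
                .filter(order -> order.amount() > 0)
                .collect(Collectors.groupingBy(
                        Order::customerId,
                        Collectors.summingDouble(Order::amount)));
    }

    public static void main(String[] args) {
        List<Order> orders = List.of(
                new Order("cust-1", 120.0),
                new Order("cust-1", 30.5),
                new Order("cust-2", 75.0));
        // Prints the per-customer totals, e.g. {cust-1=150.5, cust-2=75.0}.
        System.out.println(totalPerCustomer(orders));
    }
}
```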
2021-2023
Development Team Lead
Tofaha
  • Led the backend development team for a fast-growing e-commerce startup, focusing on Magento 2.
  • Developed an online payment module integrating Egyptian National Bank APIs for seamless transactions.

My Skills

Data Technologies
Apache Kafka: 100%
Apache Spark: 80%
Apache Flink: 100%
Apache Iceberg: 70%
Nessie: 80%
Apache Airflow: 60%
MinIO Object Store: 70%
Apache Superset: 60%
Trino: 70%
Custom Connector: 100%
KSQL: 70%
Data modelling: 70%
Java: 100%
Python: 80%
Scala: 70%
REST APIs: 100%
gRPC APIs: 80%
SOAP APIs: 70%
Spring Boot: 90%
Angular: 80%
Leadership
Fast Learning
Professional Writing
Presentation Skills
Time Management
Fluent English

Contact