Beginner-to-intermediate experience with SAP BODS data flow development, testing, and deployment is required. Experience developing SAP BODS jobs with BW/HANA as the source and SQL Server as the target is preferred.
Strong experience working with a variety of relational (SQL) and NoSQL databases
Strong experience working with big data tools such as Hadoop, Spark, and Kafka
Experience with at least one cloud provider solution (AWS, Azure, or GCP; GCP preferred)
Strong experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
Strong hands-on experience in Databricks (Unity Catalog, Workflows, optimization techniques)
Working knowledge of data transformation tools; dbt preferred.
Ability to work on the Linux platform.
Strong knowledge of data pipeline and workflow management tools
Expertise in standard software engineering methodology, e.g., unit testing, code reviews, and design documentation
Experience creating data pipelines that prepare data appropriately for ingestion and consumption
Experience setting up, maintaining, and optimizing databases/filesystems for production use in reporting and analytics
Experience with workflow orchestration tools (Airflow, Tivoli, etc.)
Working knowledge of GitHub/Git toolkit
Ability to work in a collaborative environment and interact effectively with both technical and non-technical team members (good verbal and written English)
Relevant working experience with containerization (Docker and Kubernetes) preferred
Experience working with APIs (Data as a Service) preferred