Careers
Please review our current job openings and send your profile to careers@aiwhiz.co
Senior Big Data Engineers
- Conceptual knowledge of data and analytics (dimensional modelling, ETL, reporting tools, data governance, data warehousing, structured and unstructured data) applied to solving complex data problems
- Hands-on experience with modelling, design, configuration, installation, performance tuning, and setting up sandboxes for POCs
- Work with business/application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
- Develop and maintain conceptual, logical, and physical data models; implement RDBMS, operational data stores (ODS), data marts, and data lakes on target cloud platforms
- Lead scoping sessions to generate estimates and approaches for execution
- 4+ years of experience architecting large-scale data solutions, performing architectural assessments, crafting architectural options and analyses, and finalizing the preferred solution alternative with IT and business stakeholders
- 5+ years of data engineering or architecture experience architecting, developing, and deploying scalable enterprise data analytics solutions, including design/development of large data warehouses and/or database management systems
- 2+ years of hands-on experience architecting and designing cloud data lakes (Azure or GCP) serving analytics and BI application integrations
- 5+ years of hands-on relational, dimensional, and/or analytic experience using RDBMS, dimensional, and NoSQL data platform technologies, plus ETL and data ingestion protocols
- Experience with tools such as Azure Data Factory, Databricks, and Integration Runtime services
- Experience in Spark, Python or Scala, and PySpark; expert in T-SQL
- Big data development experience using Kafka, Azure Event Hubs, and IoT Hub
- 1+ years of experience introducing and operationalizing self-service data preparation tools (e.g., Collibra, Alation, Babylon)
Big Data Engineers
- Hands-on experience with modelling, design, configuration, installation, performance tuning, and setting up sandboxes for POCs
- Work with business/application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
- Develop and maintain conceptual, logical, and physical data models; implement RDBMS, operational data stores (ODS), data marts, and data lakes on target cloud platforms
- Lead scoping sessions to generate estimates and approaches for execution
- 2+ years of experience architecting large-scale data solutions, performing architectural assessments, crafting architectural options and analyses, and finalizing the preferred solution alternative with IT and business stakeholders
- 2+ years of data engineering experience developing and deploying scalable enterprise data analytics solutions, including design/development of large data warehouses and/or database management systems
- 2+ years of hands-on experience architecting and designing cloud data lakes (Azure or GCP) serving analytics and BI application integrations
- 2+ years of hands-on relational, dimensional, and/or analytic experience using RDBMS, dimensional, and NoSQL data platform technologies, plus ETL and data ingestion protocols
- Experience with tools such as Azure Data Factory, Databricks, and Integration Runtime services
- Experience in Spark, Python or Scala, and PySpark; expert in T-SQL
- Big data development experience using Kafka, Azure Event Hubs, and IoT Hub
Freshers
- Ambitious B.Tech/B.E/MCA/M.Tech graduates from engineering institutions
- Must have graduated in 2020 or 2021
- Background in a programming language such as C, C++, Java, C#, or Python
- Knowledge of and exposure to RDBMS/DBMS concepts and techniques
- Willingness to understand, learn, and work with new tools and technologies
- Desire to work with data, data engineering, data science, full-stack, and front-end technologies
- Good communication skills
- A minimum aggregate of 60%
- Ready to take responsibility and ownership