Eligibility:
- B.E./B.Tech., M.Tech., or MCA from the 2022 batch with 60% and above throughout academics
- No backlogs/re-appears
Salary:
Stipend during the 4-month internship:
- INR 23,000 per month (Bangalore & Mumbai)
- INR 20,000 per month (Trivandrum)
CTC post internship / at FTE conversion:
- INR 8.5 LPA for Bangalore & Mumbai (6.5L fixed + 2L QCDP bonus)
- INR 5 LPA for Trivandrum (4L fixed + 1L QCDP bonus)
Work Location: Bangalore/Mumbai/Trivandrum
Interview process: (Virtual - Video)
- HackerEarth test
- Personal Interviews
WFH/WFO: We are currently working in hybrid mode
1. Platform Engineer
Responsibilities:
- Scale distributed applications, make architectural trade-offs applying synchronous and asynchronous design patterns, write code, and deliver with speed and quality.
- Design and develop scripts to build and deploy services using automated tools, with a focus on developing hooks for operations and monitoring support.
- Provide guidance to the team to implement services and systems that are highly available, scalable, and self-recoverable on cloud platforms.
- Design and deploy metrics, monitoring, and logging systems to understand system performance and isolate bottlenecks.
- Introduce new cloud technologies, tools, and processes to keep innovating in the commerce area and drive greater business value.
Requirements:
- Strong fundamentals in object-oriented design, data structures, algorithms, complexity analysis, distributed computing, and operating systems
- Proficiency in at least one modern programming language such as Java, C, C++, C#, or Perl
- Experience with a scripting language such as Bash/shell
- Understand project-specific requirements, standards, guidelines, and processes
- Demonstrated success working in a team-based environment
- Good written/oral communication skills
- Strong analytical and problem-solving skills
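As context for the synchronous/asynchronous design patterns mentioned in the responsibilities above, here is a minimal illustrative sketch (not part of the role description; function names and delays are invented for the example). It shows the core idea: overlapping I/O-bound waits instead of serializing them.

```python
import asyncio
import time

async def simulate_call(delay: float) -> float:
    # Stand-in for a network or service call (illustrative only).
    await asyncio.sleep(delay)
    return delay

async def fan_out(delays):
    # Asynchronous fan-out pattern: run all calls concurrently,
    # so total wait is roughly the slowest call, not the sum.
    return await asyncio.gather(*(simulate_call(d) for d in delays))

start = time.perf_counter()
results = asyncio.run(fan_out([0.1, 0.1, 0.1]))
elapsed = time.perf_counter() - start
# Three 0.1 s calls complete in roughly 0.1 s total, not 0.3 s.
print(results, elapsed)
```

Run synchronously, the same three calls would take about 0.3 s; the asynchronous version overlaps the waits.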
2. Data Engineer
Responsibilities:
- Work with business users and other stakeholders to understand business processes.
- Ability to understand, design, and implement dimension and fact tables
- Identify and implement data transformation/cleansing requirements
- Profile source data to check data quality and develop Data Pipelines to implement ETL logic
- Create and execute test plans, debug existing Data Pipelines
- Create data visualizations
- Write SQL code - Queries/Stored Procedures/Functions
Requirements:
- Strong SQL writing skills
- Strong knowledge of Data Warehouse and ETL concepts
- Familiarity with the Kimball and Inmon Data Warehouse methodologies
- Understand project-specific requirements, standards, guidelines, and processes
- Demonstrated success working in a team-based environment
- Good written/oral communication skills
- Strong analytical and problem-solving skills
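To illustrate the dimension/fact-table and SQL skills listed above, here is a minimal star-schema sketch using Python's built-in sqlite3 module. All table and column names (dim_product, fact_sales, etc.) are invented for the example, not taken from the posting.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes, one row per product.
cur.execute("""
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category     TEXT
)""")

# Fact table: measurable events, each row referencing a dimension key.
cur.execute("""
CREATE TABLE fact_sales (
    sale_id     INTEGER PRIMARY KEY,
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    amount      REAL
)""")

cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
cur.execute("INSERT INTO fact_sales VALUES (1, 1, 3, 29.97)")

# A typical analytic query joins the fact table to its dimension
# and aggregates measures by a dimension attribute.
cur.execute("""
SELECT p.category, SUM(f.amount)
FROM fact_sales f
JOIN dim_product p ON f.product_key = p.product_key
GROUP BY p.category
""")
result = cur.fetchall()
print(result)  # [('Hardware', 29.97)]
```

The same pattern scales to multiple dimensions (date, store, customer) joined to one central fact table, which is the structure the Kimball methodology describes.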