Junior/Senior Data Engineer Jobs in Pune - Tech9
Job Description
Tech9 is a fast-growing custom software development company. We work to represent an ideal in software development delivery. We strive to make each client love us by providing a skilled team, engaging design, solid architecture, and quality implementation. We tackle big challenges with enthusiasm and gusto.
Tech9 India is looking for junior and senior Data Engineers. This is a great opportunity to work with a company whose primary focus is making customers happy by delivering value, without the burdensome policies and rules that have become typical of outsourced software development companies.
If you are looking for a change, here is what we can promise you:
- You will have challenging problems to solve
- You will have flexibility and autonomy to solve problems and deliver solutions
- We will provide a highly collaborative environment with skilled and super friendly teammates
- We will fully support you in developing software the right way
- We won't burden you with useless policies and procedures
- We will provide you the tools you need to do your job right
If that sounds attractive, please apply! We'd love to talk to you.
Job Overview:
We are looking for savvy Data Engineers to join our growing team. This person will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder who enjoys optimizing data systems and building them from the ground up. This Data Engineer will support our software developers on data initiatives and will ensure that optimal data delivery architecture is consistent across ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Responsibilities:
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Python, SQL, and AWS
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs
- Work with data and analytics experts to strive for greater functionality in our data systems
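To give a flavor of the extraction, transformation, and loading work described above, here is a minimal sketch in Python. It uses the standard library's sqlite3 module as a stand-in for a production warehouse such as Redshift; the event rows, table, and column names are hypothetical.

```python
import sqlite3

# Hypothetical source rows, standing in for an external API or file extract
RAW_EVENTS = [
    {"user_id": 1, "event": "click", "ts": "2024-01-01T10:00:00"},
    {"user_id": 1, "event": "click", "ts": "2024-01-01T10:05:00"},
    {"user_id": 2, "event": "view", "ts": "2024-01-01T11:00:00"},
]

def load_events(conn, rows):
    """Create the target table and load the extracted rows into it."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events (user_id INTEGER, event TEXT, ts TEXT)"
    )
    # Transform step: named placeholders keep only the fields the schema expects
    conn.executemany(
        "INSERT INTO events (user_id, event, ts) VALUES (:user_id, :event, :ts)",
        rows,
    )
    conn.commit()

def clicks_per_user(conn):
    """A simple aggregation a downstream analytics team might consume."""
    cur = conn.execute(
        "SELECT user_id, COUNT(*) FROM events WHERE event = 'click' GROUP BY user_id"
    )
    return dict(cur.fetchall())

conn = sqlite3.connect(":memory:")
load_events(conn, RAW_EVENTS)
print(clicks_per_user(conn))  # {1: 2}
```

In a real pipeline the extract and load steps would target external systems, but the shape of the work (pull, reshape to a schema, load, aggregate) is the same.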
Requirements:
- Working knowledge of SQL
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Strong analytical skills related to working with unstructured datasets
- Experience building processes that support data transformation, data structures, metadata, dependency tracking, and workload management
- A successful history of manipulating, processing and extracting value from large disconnected datasets
- Strong project management and organizational skills
- Experience supporting and working with cross-functional teams in a dynamic environment
- Experience with relational SQL databases
- Knowledge of Object-Oriented Development
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience developing in Python
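The dependency-management requirement above is the kind of problem workflow tools solve: given which tasks depend on which, derive a valid run order. As a rough, tool-agnostic sketch (the task names are hypothetical), this can be expressed with a topological sort over the dependency graph using Python's standard-library graphlib:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on
DEPENDENCIES = {
    "extract": set(),
    "clean": {"extract"},
    "aggregate": {"clean"},
    "load_warehouse": {"aggregate"},
    "refresh_dashboard": {"load_warehouse"},
}

def run_order(deps):
    """Return an execution order that respects every dependency."""
    return list(TopologicalSorter(deps).static_order())

print(run_order(DEPENDENCIES))
# ['extract', 'clean', 'aggregate', 'load_warehouse', 'refresh_dashboard']
```

Tools like Airflow build on the same idea, adding scheduling, retries, and monitoring on top of the dependency graph.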
Preferred Experience:
- Experience with data pipeline and workflow management tools such as Azkaban, Luigi, or Airflow
- Experience with Flask