Freshersworld does not charge any amount for job placement. Beware of fraudsters who ask you to pay on the pretext of giving a job.


DevOps Engineer (DevOps + Bigdata) Jobs in Pune - Acquia

Acquia
Experience: 0 to 3 Years
Salary: Not disclosed
Qualification: BCA, BE/B.Tech, Other Course
Expired

Posted: 01 Mar 21

Job Description

DevOps Engineer w/ Big Data

Job description summary:

The DevOps Engineer is responsible for crafting and delivering secure, highly available solutions. You will be a critical part of a team passionate about ensuring our critical services are ready and stress tested. You should be comfortable taking on new challenges, defining potential solutions, and implementing designs in a team environment. This position is expected to both guide and support the team's growth and learning.

What You Will Accomplish
  • The DevOps Engineer partners closely with Engineering, Support, and Ops. We are responsible for the design, deployment, and continuous operation of the AgilOne platform.
  • You will evolve our existing platform to the next level with CI/CD, automated diagnostics/scaling/healing, and more.
  • You will work on a team responsible for a blend of architecture, automation, development, and application administration.
  • You will build and deploy solutions from the infrastructure, to the network, and application layers, on public cloud platforms.
  • You will ensure our SaaS platform is available and performing, and that we notice problems before our customers do.
  • You will build the tools to improve the speed, confidence, and visibility of our SaaS deployments.
  • You will help build security into every step of the software & infrastructure life cycle.
  • You will collaborate with Support and Engineering on customer issues, as needed.
  • You will build and maintain re-deployable cloud and on-premise infrastructure.
  • You will work with distributed data infrastructure, including containerization and virtualization tools, to enable unified engineering and production environments.
  • You will develop dashboards, monitors, and alerts to increase situational awareness of production issues, SLAs, and security incidents.
  • You will independently conceive and implement ways to improve development efficiency, code reliability, and test fidelity.
  • You will participate in a periodic on-call rotation.

Qualifications
  • Must have experience deploying, tuning, and maintaining Linux-based, highly available, fault-tolerant platforms on public cloud providers such as AWS, Azure, and GCP.
  • Must have in-depth knowledge of big data technologies: Hadoop, HDFS, Hive, Spark, Kafka, YARN, ZooKeeper, etc.
  • Must be comfortable with common configuration management and orchestration tools; experience with or the ability to learn Ansible and AWS/GCP services and APIs.
  • An understanding of SQL queries and how they work.
  • The ability to dig deep into infrastructure and code to tackle problems.
  • A DevOps mentality.
  • The drive to tackle traditional operations problems through automation.
  • Familiarity with a modern programming language; experience with or the ability to learn Go, Python, and Linux shell scripting.
  • Enjoy learning new tools and languages.
  • Enjoy a collaborative environment.
  • High attention to detail.
  • Strong customer focus.
  • An enthusiastic self-starter with a commitment to learning, customer empathy, and team communication.
  • A Bachelor's degree in Computer Science, Engineering, MIS, or experience in software engineering or a related field.
Desired
  • Experience with virtualization and container technologies such as Kubernetes and Docker.
  • Familiarity with standard information security methodologies; SOC 2 and HIPAA experience preferred.
  • Familiarity with common monitoring, log aggregation, and metrics capturing platforms (Nagios, Sensu, Splunk, Sumo Logic, et al.).

Experience
  • Hadoop/Hive/HDFS/Spark/Kafka/YARN: 5+ years
  • Building, or being involved in building, a CI/CD pipeline: 5+ years
  • Continuous delivery/integration tools (Jenkins, Spinnaker, Artifactory): 5+ years
  • Hands-on Unix/Linux knowledge: 5+ years
  • Writing build scripts using Python, Terraform, Unix shell (bash, ksh): 5+ years
  • DevOps and/or build & release experience, including delivery: 5+ years
  • Automation/configuration management using Ansible: 3+ years
  • Software configuration management tools: 3+ years
  • DB/data platforms (Aurora/MySQL): 3+ years
  • AWS capabilities and architecture: 3+ years
  • Modern application monitoring tools: 2+ years

Job Particulars

Education: BCA, BE/B.Tech, Other Course
Who can apply: Freshers and Experienced (0 to 3 Years)
Hiring Process: Face to Face Interview
Employment Type: 0
Job Id: 1132974
Locality Address:
Country: India

About Company

Acquia