
Lead Data Engineer

State Farm
Jan 25, 2025

Location: Dunwoody, GA
Job ID: 2025-40074
Type: Regular Full Time
# of Openings: 1
Category: Technology and UX

Overview

Being good neighbors - helping people, investing in our communities, and making the world a better place - is who we are at State Farm. It is at the core of how we operate and the reason for our success. Come join a #1 team and do some good!



Responsibilities

State Farm (Dunwoody, GA) seeks a Lead Data Engineer to design and build out cloud-based data analytics capabilities for monitoring and reporting, utilizing a mix of vendor-supplied and custom-developed application tooling.

Specific duties include:
- Understand how technology solutions meet business outcomes and offer a range of solutions for business partners.
- Participate in sprint planning, providing work estimates to deliver product stories and owning development stories.
- Complete required coding to satisfy acceptance criteria and deliver desired outcomes.
- Participate in solution design, considering risks, mitigations, performance, user experience, and testability.
- Assist in the development of automated testing and support code as necessary.
- Complete required documentation to communicate information to deployment, maintenance, and business teams.
- Utilize agile software development practices, data and testing standards, code reviews, source code management, continuous delivery, and software architecture.
- Participate in the full software development cycle, including coding, testing, implementation, support, and sunset.
- Design, develop, test, and support software in support of product objectives.
- Consider applying emerging technology solutions to increase efficiency and effectiveness.
- Resolve problems in ways that decrease time to market, improve quality, and enhance flexibility.
- Provide input into the overall testing plan, and contribute to the test approach and scenarios for requirements.
- Exhibit a DevOps mindset, where the team is accountable for the product from inception to sunset.



Qualifications

The position requires a Bachelor's degree, or foreign equivalent, in Computer Science, Information Systems, or a closely related field of study, plus 5 years of experience in the job offered, as a Programmer Analyst, or in a similar position in software development and data analysis. In the alternative, will accept a Master's degree, or foreign equivalent, in Computer Science, Information Systems, or a closely related field of study, plus 2 years of experience in the job offered, as a Programmer Analyst, or in a similar position in software development and data analysis.

Specific experience must include:
- Working with NodeJS, Python, or Java backend service languages.
- Using multiple operating systems such as Windows Server and Linux/Unix, including virtual machine setup and configuration management.
- Using industry tools, frameworks, and languages such as Java, J2EE, Spring Framework (Spring, Spring Boot, etc.), SQL, JavaScript, SOAP, REST, XML, WSDL, CXF, JSON, JUnit, Mockito, EasyMock, Splunk, Redis, LDAP, Vault, Swagger, React, Node.js, Spark, or Python.
- Working with DevOps using Git and software pipelines.
- Creating pipelines/automation to deploy changes to production with quality and DevOps best practices (Git, GitOps, etc.), resiliency, alerting, and monitoring.
- Using databases and tooling, including DB2, IMS, SQL, PostgreSQL, and pgAdmin, as well as cloud-based database services such as RDS, DynamoDB, or MongoDB.
- Developing and enhancing machine learning-based applications.
- Working with CI/CD continuous delivery pipelines (GitLab CI, Jenkins, Urban Code).
- Using Splunk, QuickSight dashboard development, or Dynatrace reports for monitoring.
- Working with Spark, Scala, Hadoop, or similar data analytics tooling.

Must be certified in one of the following: Splunk Power User, Hadoop, AWS CCP or CCA, Microsoft Azure, Google Cloud Professional, or a similar cloud-based certification. Must take and pass a pre-hire Python coding test. Option to work in a hybrid environment with required travel to one of the following hub offices once a quarter for planning sessions and other team/area meetings: Bloomington, IL; Atlanta, GA; Dallas, TX; or Phoenix, AZ. Must live within 180 miles of one of these hub offices.

Full-time position. Apply by submitting a resume at https://jobs.statefarm.com/main/jobs, Job ID: 40074.
