Senior Data Engineer/Architect/Spark/Databricks II
Enable Data Incorporated
Minneapolis, Minnesota
Job Details
Contract
Full Job Description
Job Responsibilities:
- Design, code, test, document, and maintain high-quality, scalable Big Data solutions in public and on-premises cloud environments.
- Research, evaluate, and deploy new tools, frameworks, and patterns to build a sustainable Big Data platform.
- Identify gaps and opportunities for improving existing solutions.
- Define and develop APIs for integration with various data sources across the enterprise.
- Analyze and define customer requirements.
- Assist in defining product technical architecture.
- Make accurate development effort estimates to assist management in project and resource planning.
- Create prototypes and proofs of concept; participate in design and code reviews.
- Collaborate with management, quality assurance, architecture, and other development teams.
- Write technical documentation and participate in production support.
- Keep skills up to date through ongoing self-directed training.
- The ideal candidate is a self-starter who learns quickly and is enthusiastic, proactive, and eager to learn.
Requirements
Required qualifications:
- Bachelor's degree in a technical field, or equivalent experience.
- 10+ years of hands-on software development experience using Java or Scala.
- 8+ years of experience with Hadoop, MapReduce, HDFS, Spark, Streaming, Kafka, and NoSQL.
- 8+ years of experience with at least one major public cloud platform (Azure or AWS preferred).
- 8+ years of relational database experience with DB2, Oracle, or SQL Server.
- Hands-on experience with Databricks.
- Thorough understanding of service-oriented architecture (SOA) concepts.
- Experience with Agile/Scrum methodology and best practices.
- Proven track record of learning new tools and technologies and applying them on integration and implementation projects.
- Databricks certification preferred.