Big Data Developer
This role has been filled.
Our client is looking for a Big Data Developer for several of their projects involving highly scalable and extensible data and BI platforms. You will be instrumental in operationalizing data and analytics technologies as scalable IT and business capabilities. Responsibilities include:
- Develop and implement complex big data projects, translating business requirements into technical solutions
- Implement and operate stable, scalable, low-cost data flow solutions
- Improve performance and stability through research and analysis, and document and maintain Big Data systems
- Design, implement, and deploy data loading and access mechanisms across the delivery channels of the big data ecosystem
- Administer and implement Big Data platforms and configuration processes
- Perform software, code, and requirements analysis, software review, system risk analysis, software reliability analysis, capacity planning, etc.
- Create effective technical solutions and services through innovative architecture, design and development decisions
- Serve as a subject-matter expert on multiple concurrent projects through all phases of the development lifecycle
- Define standards and best practices for big data products, informed by research into new technologies
- Collaborate with Data Warehouse leaders to define strategy and ensure actionable business intelligence is provided to key stakeholders and BI teams
- Ensure 3rd Party data needs are met in a scalable way by partnering with the Data Warehouse team
Qualifications:
- Bachelor's degree in Computer Science, another technical field, or equivalent work experience
- 5+ years’ experience with various Big Data Platforms
- 5+ years of experience with Data and BI Architecture and Management Services
- Expert-level experience with Hadoop-based components (HDFS, Hive, HBase, Oozie, YARN, Sqoop, ZooKeeper, Flume, Spark, Kafka, etc.)
- Expertise in the design, creation, management, and business use of large datasets during data ingestion and consumption
- Demonstrated skill with installation, configuration and administration of Hadoop platforms
- Expertise in software engineering development and testing life cycles (Linux, Java, scripting, SQL, and others)
- Expertise in capacity planning, deployment, troubleshooting and performance tuning
- Exceptional communication and interpersonal skills
- Financial Services industry experience
- Organized, analytical, and a problem solver
- Proven ability to balance competing priorities and work in a fast-paced environment
- Experience with Agile or other rapid application development methods
- Team player who can coach and mentor when appropriate