Job Type : Contract
Number Of Applications : 15
Job Posted On : 14/11/2019
Job Duration: 6 months
Hourly Rate: $50 - $59
Daily Rate: $400 - $475
Monthly Salary: $8,800 - $10,450
Our company is in the process of establishing an enterprise Data Lake and scaling up its capabilities to redesign our legacy batch data pipelines while also enabling near real-time stream processing. To help us on this journey and to shape our future data products, as well as their production delivery and support, we are looking for a skilled software engineer with a keen interest in data and its processing at scale. If you consider yourself a software developer who can take functional requirements and design and develop reusable, maintainable solutions at production-level quality, we would like to hear from you. An interest in data, an appreciation of the importance of robust, fully automated data pipelines (deliveries) and experience with processing in distributed, scalable environments are highly desirable. We offer you the opportunity to challenge the status quo, influence the direction of our future developments and enjoy autonomy in designing our future-state data solutions, which we would like you to own and drive.

Soft Skills
• Interested in data and in finding value in it.
• Independent and eager to take full ownership of your work.
• Proactive rather than reactive in recognizing and implementing opportunities for improvement.
• Able to deliver end-to-end solutions.
• Enjoy solving complex problems.
• Able to communicate complex designs and solutions clearly to different audiences.
• Take pride in high standards of code and documentation quality.
• Collaborative and willing to share knowledge.
• Able to lead and upskill junior members of the team.

Technical Skills
• Software engineering skills (Computer Science degree): robust design, modularity, code reusability, testing, SDLC
• Proven ability to deliver enterprise-level, production-ready code: logging, alerting, monitoring, recovery and re-run ability
• Automation: CI/CD, different levels of testing, production staging
• Programming approaches: OO, functional, TDD and/or BDD
• Programming languages: Java, Scala, Python, SQL, Shell/Bash
• Code management: Git, Maven, Sonar/Checkmarx, Cloudbees/Jenkins
• Familiarity with standard data storage technologies: SQL, NoSQL, Object Storage

Bonus
• Experience with distributed and scalable data processing frameworks: Spark, Kafka Connect, Kafka Streams, NiFi
• Experience in building automated end-to-end data pipelines
• Track record of production-level real-time/stream processing
• Experience with public and/or private cloud environments
• Ability to pioneer solutions where distributed processing is separated from data storage
• Experience with leveraging infrastructure as code
Test Triangle is an emerging IT service provider specializing in application testing, DevOps, RPA, custom software development, mobile app development, Atlassian consultancy, niche IT staff augmentation and training in advanced technologies. With strong experience across industry verticals such as Banking & Finance, Healthcare, Retail, IT & Education, Test Triangle has developed a unique approach to providing better value to its clients.