Systems Engineer IV – Application Analyst
Posted 30 days ago
| Hours | Full-time, Part-time |
|---|---|
| Location | Greenwood Village, Colorado |
About this job
JOB SCOPE
Primarily responsible for the ongoing development of enhancements to various data reconciliation projects, centered on Java, SQL, Hive, Hadoop, and Dart. Implement critical reporting for senior leadership and various internal clients. Splunk and SQL Developer are heavily used within the team and require a high level of knowledge. The role is heavily analytics-focused, as the team is constantly working to uncover system defects and to trend data on success rates, defects, anomalies, and many other data points.
DUTIES AND RESPONSIBILITIES
- Responsibilities include troubleshooting, reporting on, and supporting the various Flow-Through Provisioning systems owned by the APO Group.
- Analyze release notes for upcoming code releases and report any issues found with the logic and/or MOPs.
- Manage and execute development enhancements.
- Work with SQL developers to improve current reporting and find potential for new reports.
- Work with Java and Python Developers to provide requirements for new tools.
- Collaborate with many groups, including IT, Advanced Engineering, Product, Business Analysts, and PAC.
- Perform other duties as requested by manager.
QUALIFICATIONS
- Bachelor's degree in Computer Science, Engineering, or a related field, and/or equivalent work experience.
- Minimum six (6) years of engineering work experience.
- Minimum six (6) years of experience with SQL Developer.
- Minimum six (6) years of experience with Python.
- Minimum two (2) years of experience in the cable industry.
- Minimum two (2) years of experience working with Big Data technologies.
- Minimum two (2) years of experience working with flow-through back-end systems.
- Minimum one (1) year of experience with Splunk as a Power User.
PREFERRED QUALIFICATIONS
- Senior Big Data Architect with strong skills in Python/Scala, SQL, Spark, and UNIX/Linux
- Hands-on experience with Kafka and/or streaming applications
- Working knowledge of cloud, containerization, microservices architecture, and serverless architecture
- Understanding of application security in on-premise and cloud environments
- Experience in converting functional requirements to technical requirements
- Ability to drive architecture and solution discussions with IT executives, as well as mentor and guide technical teams
- Hands-on experience with AWS/Azure