Support Engineer - Data Operations - Berwyn, PA

Yodlee - Berwyn, PA


You will have –
Bachelor’s degree in Computer Science
3-5 years of support industry experience
Expertise in the L1/L2 support model, with flexibility to support data issues and product delivery on weekends as needed

You will definitely possess these technical skills –
Hands-on experience with Incident Management Systems and Processes
Proficiency working and developing in Linux environments
Strong Shell scripting skills
Knowledge of Perl, PHP, or Python programming
Expertise with monitoring tools such as Nagios, Foglight, and Cloudera Manager
Basic SQL and PL/SQL coding skills
Experience with Hadoop and knowledge of AWS EC2
Experience with Oracle databases

Nice for you to have (but not necessary) –
Experience with MySQL databases and queries
Experience with MongoDB or other NoSQL databases
Operational familiarity with Elasticsearch, EMR, Spark, and web services
Experience with Redshift and Tableau is a plus
Experience building Splunk dashboards is a big plus

Bird’s eye view of your role –
The Yodlee Data Operations team ensures data quality and integrity to fuel the future growth of Yodlee services. This is an opportunity to be part of a team that delivers rock-solid, stable solutions enabling Yodlee's analytical functions. The team strives to reduce the risks of Yodlee's operations by introducing new monitoring features and optimizing the Yodlee environment to perform efficiently. It develops and implements tools and monitors to detect data defects early, and builds detailed, forward-looking dashboards to create a holistic view of the data products. Yodlee's analytical functions rely heavily on the AWS environment for daily jobs.

We are looking for a highly motivated, skilled, detail- and results-oriented support engineer to monitor the environment and respond to alerts in a timely manner. You will play a critical role in actively solving production issues with analytical processes and products, and you should be prepared to drive the quality and SLA of product delivery. You will leverage a technology stack that includes Python, Perl, Linux/shell, Splunk, Hive, Redshift, Tableau, Oracle, and SQL/MySQL, and you will quickly grasp new technologies. You will also work with cross-functional teams to understand and maintain analytical products in production.

What will you bring –
Passion for data and the ability to adapt quickly to financial data streams and patterns. Strong competency in responding to alerts, performing analysis, identifying data defects, and troubleshooting. You are a self-starter and team player with superb communication skills, and you love to learn new software, frameworks, and tools.

What your performance objectives will be –

Understand Yodlee's infrastructure and financial data components, which include tools, dashboards, monitors, and alerts developed using Linux, Perl, PHP, Python, MySQL, and Oracle
Quickly learn and enhance monitoring skills around Splunk, Hadoop, Redshift, and Tableau based on project needs.
Understand big data processing components and support them efficiently.
Effectively follow shift allocations and execute L1/L2 support procedures.
Work with cross-functional teams to root-cause and resolve data issues.
Execute L3 corrective procedures on production systems as requested or needed.
Execute SQL/PL/SQL to fix data defects once the root cause is identified.
How will your lofty goals be translated into specific actions / short term goals –

Within the first 30 days, you will learn the Yodlee data pipeline and understand the existing big data components, analytic tools, and reports available.
During the first quarter, you will evolve into a strong contributor to the Data Operations team by supporting the big data system solutions, both on-premise and in the cloud. In parallel, you will contribute to new procedures and runbook documentation, ensure SLA/delivery by reacting to alerts promptly, and be self-driven in closing all incidents end to end.
By the end of six months, you will be an established team member working independently, making recommendations on Yodlee data functions and processes to maintain a healthy data environment.
What’s in it for you? (EVP – Employee Value Proposition) –

An opportunity to make a difference in Yodlee's Data Culture by building a robust data delivery Platform.
Sharpen your analytical skills by working in a rich and voluminous financial data domain.
Gain expertise in niche big data skill sets like Hadoop, Redshift, Splunk, Tableau, Hive, Pig, and AWS, all under a single Yodlee umbrella.