1225 Washington Pike, Bridgeville, PA 15017, US | Fully Remote (Telecommute)
Posted: 03/20/2023
Job Category: Database
Job Number: 6676
Job Description
Responsibilities
- Designs and develops high-quality Data Warehouse solutions
- Develops high-quality, scalable data pipelines and data processes in an Azure distributed cloud environment
- Conducts testing, code reviews, data integrity checks, and performance optimization
- Creates and maintains technical design documentation
- Leads requirements gathering for data modeling and contributes to data architecture
- Writes and promotes high-quality code that produces accurate data
- Supports developers, data analysts, business partners, and data scientists who need to interact with the data platform
- Responsible for production support, including analyzing root cause and developing fixes to restore ETL and data operational readiness, planning and coordinating maintenance, conducting audits, and validating jobs and data
- Mentors other team members, cross-trains and provides guidance
- Applies a solid understanding of the work estimation process to lead large/complex estimation activities
- Meets deadlines within budget and schedule while maintaining appropriate quality
- Adheres to enterprise architecture standards and contributes to shaping development and testing standards
- Maintains pipelines in a Git repository
- Learns and contributes to solving data problems, applying practical experience
- Contributes to a collaborative work environment within and across teams
Basic Qualifications
- Experience in Data Warehouse design and data modeling patterns (on-premises or cloud)
- Experience in developing SQL/Synapse data warehouses and T-SQL coding
- Experience in developing/supporting a data platform in Azure with a data lake, Azure SQL Server, or Synapse
- Experience with Azure/AWS and Databricks
- Experience with ETL tools such as ADF or equivalent
- Experience in Python, Scala, or another language in a distributed cloud environment
- Strong experience in performance tuning of ADF jobs and SQL with medium to large volumes of data
- Highly skilled in ETL tools such as SSIS, Informatica, Talend, AWS Glue, Azure Data Factory, or equivalent
- Expert in writing T-SQL, pgSQL, PL/SQL, or equivalent for processing big data
- Well-rounded experience working in a DevOps environment, supporting data platform processes for business units
- Thorough knowledge of core data concepts, providing solutions for business use cases in a distributed computing environment
- Expert-level knowledge of writing Python or Scala code in a distributed computing environment, handling big data loads in lakehouse and Delta Lake environments
- Strong knowledge of performance improvement methods for data processes
- Experience working in an agile Data Warehouse team with 5+ members
- Knowledge of BI tools such as Power BI or equivalent to support Data Warehouse development, testing, and operational support activities
- Excellent written and verbal communication skills
- Ability to work independently, handle multiple tasks simultaneously and adapt quickly to change with a variety of people and work styles
- Must be capable of fully yet concisely articulating technical concepts to non-technical audiences
- Keen to learn new concepts and keep up to date with emerging technology stacks
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information, or any other characteristic protected by law.