Job Description:
Specialist, Data Engineering
Overall Purpose
This position will interact on a consistent basis with other developers, architects, data product owners, and source system teams. It requires multifaceted candidates with experience in data analysis and visualization, strong hands-on experience with BI tools and relational databases, and experience in both traditional and cloud data warehouse architecture.
Key Roles and Responsibilities
• Develop, understand, and enhance code in traditional data warehouse, data lake, and cloud environments such as Snowflake, Azure, and Databricks
• Build new end-to-end business intelligence solutions, including data extraction, ETL processes that derive useful business insights, and dashboards that best represent the resulting data.
• Write complex SQL queries to transform data, supported by Python or Unix shell scripting.
• Understand business requirements and create visual reports and dashboards using Power BI or Tableau.
• Upskill to new technologies and understand existing products and programs in place.
• Work with other development and operations teams.
• Be flexible with shifts and provide occasional weekend support.
Key Competencies
• Full life-cycle experience on enterprise software development projects.
• Experience in relational databases/ data marts/data warehouses and complex SQL programming.
• Extensive experience in ETL, shell or Python scripting, data modelling, analysis, and preparation.
• Experience with Unix/Linux systems, file systems, and shell scripting.
• Knowledge of cloud platforms such as AWS, Azure, or Snowflake is a plus.
• Experience with BI reporting tools (Power BI or Tableau) is a plus.
• Good problem-solving and analytical skills used to resolve technical problems.
• Must possess a good understanding of business requirements and IT strategies.
• Ability to work independently while remaining a strong team player; able to drive business decisions and take ownership of their work.
• Experience in presentation design, development, and delivery, with strong communication skills to present analytical results and recommendations for action-oriented, data-driven decisions and their operational and financial impacts.
Required/Desired Skills
• RDBMS and Data Warehousing (Required 4-6 Years)
• SQL Programming and ETL (Required 4-6 Years)
• Unix/Linux shell scripting (Required 3-4 years)
• Power BI / Tableau (Desired 2 years)
• Python or any other programming language (Desired 2 years)
• Cloud Platforms - Azure, Snowflake, Databricks, Delta Lake (Desired)
Education & Qualifications
University Degree in Computer Science and/or Analytics
Minimum Experience required: 4-6 years in relational database design & development, ETL development
Weekly Hours:
40
Time Type:
Regular
Location:
Bangalore, Karnataka, India
It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities.
AT&T will consider for employment qualified applicants in a manner consistent with the requirements of federal, state, and local laws.
We expect employees to be honest, trustworthy, and to operate with integrity. Discrimination and all unlawful harassment (including sexual harassment) in employment are not tolerated. We encourage success based on our individual merits and abilities without regard to race, color, religion, national origin, gender, sexual orientation, gender identity, age, disability, marital status, citizenship status, military status, protected veteran status, or employment status.