Data Engineer Intern – Snowflake

About the internship

  • Build and maintain data pipelines using Apache Airflow or custom scripts.
  • Manage and improve the integrity and reliability of data services.
  • Build reliable ingestion frameworks to onboard new data into our Snowflake data warehouse.
  • Foster collaboration among engineering, security compliance, IT, and other business groups to ensure data is secure and auditable.
  • Train distributed team members on data pipelines.
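The pipeline work described above follows a familiar extract-transform-load shape. A minimal sketch in plain Python (function and table names such as `extract_orders` are illustrative assumptions, and in production these steps would typically run as Airflow tasks rather than bare functions):

```python
# Sketch of an extract-transform-load pipeline in plain Python.
# All names here are hypothetical; a real deployment would orchestrate
# these steps with Airflow and load into a warehouse, not a list.

def extract_orders(raw_records):
    """Extract: keep only well-formed records (must have an id and amount)."""
    return [r for r in raw_records if "id" in r and "amount" in r]

def transform_orders(records):
    """Transform: normalize amounts to integer cents."""
    return [{"id": r["id"], "amount_cents": int(round(r["amount"] * 100))}
            for r in records]

def load_rows(rows, target):
    """Load: append rows to a target 'table' (a list standing in for a warehouse)."""
    target.extend(rows)
    return len(rows)

# Wire the steps together, as an orchestrator would.
raw = [{"id": 1, "amount": 9.99}, {"id": 2}, {"id": 3, "amount": 0.5}]
warehouse_table = []
loaded = load_rows(transform_orders(extract_orders(raw)), warehouse_table)
```

The malformed record (no amount) is dropped at the extract step, so two rows reach the target table.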

Skill(s) required

  • Excellent understanding of database modelling and SQL
  • Experience consuming REST APIs using Python
  • Experience writing Airflow jobs in Python
  • Experience with ELT-based data pipeline build-outs is useful
  • Strong communication and cross-functional collaboration skills
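In practice, the API-consumption skill often comes down to handling paginated JSON responses. A hedged sketch (the `items`/`next` field names are assumptions about the payload shape, and the fetcher is stubbed with an in-memory dict so the example stays self-contained instead of making real HTTP calls):

```python
import json

def parse_page(payload):
    """Parse one JSON page; 'items' and 'next' are assumed field names."""
    doc = json.loads(payload)
    return doc["items"], doc.get("next")

def fetch_all(fetch_page):
    """Follow 'next' cursors until exhausted; fetch_page stands in for an HTTP GET."""
    items, cursor = [], None
    while True:
        page_items, cursor = parse_page(fetch_page(cursor))
        items.extend(page_items)
        if cursor is None:
            return items

# Stub fetcher simulating a two-page API response.
pages = {None: '{"items": [1, 2], "next": "p2"}',
         "p2": '{"items": [3]}'}
all_items = fetch_all(lambda cursor: pages[cursor])
```

In a real ingestion job the lambda would be replaced by an authenticated HTTP call, with the returned items written to a staging table.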

Who can apply

1. Are available for a full-time (in-office) internship

2. Have an MS in Computer Science or equivalent practical experience

3. Can write solutions using Python and SQL

4. Have relevant skills and interests

Additional information

Excellent problem-solving ability and attention to detail

Location: Pune, India 

Number of openings

N/A 

About Company

Snowflake delivers the Data Cloud, a global network where thousands of organizations mobilize data with near-unlimited scale, concurrency, and performance. Inside the Data Cloud, organizations unite their siloed data, easily discover and securely share governed data, and execute diverse analytic workloads. Wherever data or users live, Snowflake delivers a single and seamless experience across multiple public clouds. Snowflake's platform is the engine that powers and provides access to the Data Cloud, creating a solution for data warehousing, data lakes, data engineering, data science, data application development, and data sharing. Snowflake customers, partners, and data providers are already taking their businesses to new frontiers in the Data Cloud.