Location: Downtown Chicago
Duration: 6+ months
Ideal candidates will be local and able to work a hybrid schedule.
Responsibilities:
- Design and implement a Snowflake data warehouse.
- Design and deploy services to AWS using Terraform and CloudFormation stacks.
- Design and deploy CI/CD pipelines for applications.
- Design near-real-time data pipelines using Kafka streaming.
Qualifications:
- 6+ years of experience in data engineering or related technical work, including business intelligence and analytics
- 4+ years of experience architecting commercial-scale data pipelines
- Experience with MDM (master data management) solutions and their integration with other systems
- Experience designing and building scalable, robust data pipelines that enable data-driven business decisions
- Exposure to AWS or another cloud provider
- Experience with business intelligence tools such as Tableau, ThoughtSpot, Power BI, and/or Looker
- Familiarity with data warehousing platforms and data pipeline tools such as Redshift, Snowflake, SQL Server, etc.
- Experience with large-scale enterprise applications using open-source big data solutions such as Spark, Kafka, Elasticsearch/Solr, Hadoop, and HBase
- Experience with or knowledge of basic programming and database technologies (SQL, Python, Cassandra, PostgreSQL, AWS Aurora, AWS RDS, MongoDB, Redis, Couchbase, Oracle, MySQL, Teradata)
- Bachelor's degree in Engineering, Computer Science, Statistics, Economics, Mathematics, Finance, or a related quantitative field