Job description

Job ID: 3987
Pay rate range: $80 - $95
City: Chicago
State: Illinois
Duration: 09/18/2022 - 03/18/2023
Job Type: Contract
Job Description
Data Architect (Cloud)
Chicago local preferred, but remote may be considered
$80/hr W2 - $95/hr C2C (1.5x overtime)
6 months
Must have this experience:
This project integrates data from new sources into the Data Platform in Snowflake: setting up integrations for new data sources and implementing the AWS infrastructure to support them. Requires 6+ years of experience in data engineering or related technical work (including business intelligence and analytics), 4+ years of experience architecting commercial-scale data pipelines, and experience with business intelligence tools such as Tableau, ThoughtSpot, Power BI, and/or Looker.
PURPOSE:
We’re working to advance care through data-driven decisions and automation. This mission serves as the foundation for every decision as we create the future of travel. We can’t do that without the best talent – talent that is innovative, curious, and driven to create exceptional experiences for our guests, customers, owners, and colleagues.
The company seeks an experienced Data Architect who will be an exceptional addition to our growing engineering team. The Data Architect will work closely with data engineering, data product management, and data science teams to meet the data requirements of various initiatives.
As a Data Architect, you will lead the creation of the strategic enterprise data architecture; partner with internal stakeholders to define the principles, standards, and guidelines governing data flows, data aggregation, data migration, data curation, data modeling, data consumption, and data placement; provide expertise on data architecture in critical programs, data strategy, and data quality remediation activities; and validate data architecture for adherence to defined policies, standards, and guidelines, including regulatory directives.
You will be a part of a ground-floor, hands-on, highly visible team which is positioned for growth and is highly collaborative and passionate about data.
This candidate builds fantastic relationships across all levels of the organization and is recognized as a problem solver who looks to elevate the work of everyone around them.
Provides expert guidance to projects to ensure that their processes and deliverables align with the target state architecture.
Defines & develops enterprise data architecture concepts and standards leveraging leading architecture practices and advanced data technologies.
Gather requirements from business stakeholders; domains include agile team work, people data, and hierarchies of project portfolio work
Write requirements for ETL and BI developers
Write designs for the data architecture of data warehouse or data lake solutions, or end-to-end pipelines
Expertise in data architecture principles and distributed computing know-how
Prioritize intake, perform cost/benefit analysis, and decide what to pursue across a wide base of users/stakeholders and across products, databases, and services
Design or approve data models that provide a full view of what the technology teams are working on and the business impact they are having
Design end-to-end data pipelines, including security review, architecture, and deployment oversight
Automate reporting views used by management and executives to decide where to invest the organization’s time and resources and to stay up to date on key company initiatives and products
Create self-service reporting, including a data lake for internal projects and resources
Design comprehensive data quality management tooling
The ideal candidate demonstrates a commitment to core values: respect, integrity, humility, empathy, creativity, and fun.
QUALIFICATIONS:
6+ years of experience within the field of data engineering or related technical work including business intelligence, analytics
4+ years of experience in architecture for commercial scale data pipelines
Experience and comfort solving problems in an ambiguous environment of constant change; the tenacity to thrive in a dynamic, fast-paced environment, inspire change, and collaborate with a variety of individuals and organizational partners
Experience designing and building scalable and robust data pipelines to enable data-driven decisions for the business
Exposure to Amazon AWS or another cloud provider
Experience with Business Intelligence tools such as Tableau, ThoughtSpot, PowerBI and/or Looker
Familiarity with data warehousing platforms and data pipeline tools such as Redshift, Snowflake, SQL Server, etc.
Passionate about programming and learning new technologies; focused on helping yourself and the team improve skills
Effective problem-solving and analytical skills; ability to manage and report on multiple projects simultaneously across different stakeholders
Rigorous attention to detail and accuracy
Aware of and motivated by driving business value
Experience with large scale enterprise applications using big data open-source solutions such as Spark, Kafka, Elastic Search / Solr and Hadoop, HBase
Experience with or knowledge of basic programming and database technologies (SQL, Python, Cassandra, PostgreSQL, AWS Aurora, AWS RDS, MongoDB, Redis, Couchbase, Oracle, MySQL, Teradata)
Bachelor’s degree in Engineering, Computer Science, Statistics, Economics, Mathematics, Finance, or a related quantitative field
An advanced CS degree is a plus
#PCIT #LI-Remote