Location: Auburn Hills, MI
Duration: Long Term
Our end client is one of the largest financial services companies headquartered in Dallas, Texas, strategically aligned into three major business segments: the Business Bank, the Retail Bank, and Wealth Management. It has retail banking operations in Texas, Michigan, Arizona, California, and Florida, with select business operations in several other U.S. states, as well as in Canada and Mexico.
Job Responsibilities
- Techno-functional data architect responsible for defining business requirements, designing data solutions, and implementing them using AWS tools and technologies.
- Lead data quality improvement initiatives across the organization by establishing a data governance and stewardship structure.
- Design and develop databases, data models, analytics, and other strategies on the enterprise data lake and data warehouse while coaching a team of junior analysts and data engineers.
- Design and build custom reporting, visualizations, and dashboards that support data-driven decision making by senior leadership.
- Deep dive into large data sets to solve key business problems using SQL and other data manipulation languages.
- Own the development and maintenance of new and existing artifacts focused on analysis of requirements and key metrics.
- Continually improve ongoing reporting and analysis processes, while automating or simplifying self-service support and access to authoritative data sets.
- Partner with execution/business teams to consult, develop and implement KPIs, automated reporting/process solutions, and process improvements to meet business needs.
- Translate complex or ambiguous business problem statements into analytic requirements.
- Responsible for critical data processing and enrichment using Python on the AWS EMR big data platform (see the PySpark sketch after this list).
- Leverage the Cucumber behavior-driven development (BDD) framework to define test scripts and run ETL workflows using Gherkin feature files and Ruby (a sample feature file also follows this list).
- Learn and understand a broad range of data sources and leverage them as needed.
- Scale data processes and reports; partner with data engineering to drive full-scale automation.
- Produce clearly written artifacts that are easily digestible by business and technology partners.
- To support this initiative, you must be able to think strategically, perform deep analytics, architect solutions, use AWS data tools effectively, and communicate clearly with senior leadership.
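
As a concrete illustration of the Python-on-EMR responsibility above, a minimal PySpark enrichment job might look like the sketch below. All S3 paths, dataset names, and column names are hypothetical placeholders, not details from the actual engagement.

```python
# Minimal PySpark enrichment sketch of the kind run on AWS EMR.
# All S3 paths and column names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("transaction-enrichment").getOrCreate()

# Read raw transactions and branch reference data from the data lake.
transactions = spark.read.parquet("s3://example-lake/raw/transactions/")
branches = spark.read.parquet("s3://example-lake/reference/branches/")

# Enrich each transaction with branch attributes and derived columns.
enriched = (
    transactions
    .join(branches, on="branch_id", how="left")
    .withColumn("is_high_value", F.col("amount") > 10000)
    .withColumn("processed_date", F.current_date())
)

# Write the curated output back to the lake, partitioned by run date.
(enriched.write.mode("overwrite")
    .partitionBy("processed_date")
    .parquet("s3://example-lake/curated/transactions_enriched/"))
```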
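For the Cucumber/BDD item, a Gherkin feature file validating an ETL workflow might read as follows. The feature, scenario, and step wording are illustrative only; the matching step definitions would be implemented in Ruby, as the posting describes.

```gherkin
# Hypothetical Gherkin feature for validating an ETL workflow;
# step definitions would be implemented in Ruby via Cucumber.
Feature: Transaction enrichment ETL
  Scenario: Enrichment preserves row counts and keys
    Given the raw transactions dataset has been loaded to the data lake
    When the enrichment workflow runs on EMR
    Then the enriched dataset contains the same number of rows
    And every enriched row has a non-null branch_id
```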
Basic Qualifications:
- Bachelor's degree in Computer Science.
- 5+ years of progressive experience in data analytics and ETL.
- Experience delivering large-scale analytics projects and BI solutions using AWS or similar data tools and technologies.
- Expert-level skills in writing and optimizing SQL and Python to handle extremely large datasets.
- Experience in scripting languages for parsing and data analysis.
- Expert skills in modeling data using data warehousing tools.
- Proficiency and direct experience with one or more major data visualization tools such as Tableau, Amazon QuickSight, or Power BI.