- Job Title: Data Architect
- Job ID:
- Location: Ann Arbor, MI 48103
- Other Location:
Essential Duties and Responsibilities:
As the Sr. Data Architect, you will be responsible for articulating and driving aspects of our real-time stream-messaging data strategy, as well as developing and validating the architecture and solution design aligned to critical business capabilities. You will provide technical expertise and will guide and mentor development teams on methodologies, processes, and best practices for cloud data architectures, data science platforms, event-driven architecture, and API-enabled data services in the mortgage finance business domain.
· Lead, inspire, and influence to make sure your team is successful. You will provide leadership, technical direction, and oversight to multiple software engineering teams.
· Ensure consistency of integrated technology design and practices across development teams.
· Contribute to the technology strategy, architectural vision, integration, and problem solving across the platform teams.
· Dig into the root cause and details of systems, getting hands-on with code, data, and analysis to evaluate how the team and the product are growing.
Knowledge, Skills, and Abilities:
· Ability to think tactically and strategically.
· Able to think in terms of functional sustainability rather than just short-term wins.
· Ability to coach and mentor engineers.
· Ability to lead team learning events.
· Ability to anticipate and avoid operational problems.
· Confident in a leadership role. Respectful of others, accepting, and leads with a We Care attitude.
· Articulate communicator and effective listener at all levels.
· Cross-functional team building; able to build relationships to improve results.
· Understands the greater good. Ability to develop win/win solutions.
· Intellectual curiosity, with an open mind and tolerance of differences.
· Willingness to experiment and try new methods.
· Experience in implementing innovation without disruption.
· Process improvement skills.
· Ability to disseminate and enforce best practices and policy.
· Balanced judgment of the impact and cost of programs.
· Ability to drive a culture of continuous improvement.
Education and/or Work Experience Requirements:
· Bachelor’s degree in Computer Science, Information Technology, or a related field.
· Relational database administration experience (SQL Server, Azure SQL), particularly in index optimization, merge-processing optimization, database maintenance (index maintenance, statistics maintenance, DBCC CHECKDB), deadlock resolution, and database user and role security.
· 12+ years in a software development discipline, and a proven track record of delivering solutions with increasing scope of responsibilities and business impact.
· Demonstrated experience developing solutions that include service-oriented and event-driven architectures; data stream processing experience is required.
· Passionate about integration and platforms with a strong desire for market leadership.
· Proven experience working with distributed engineering teams. Technical knowledge to provide credible hands-on guidance to highly qualified engineers.
· Hands-on experience in application development, CI/CD, DevOps principles, frameworks, and tools that simplify development via agile methodologies. A knack for delivering business value to customers iteratively and continuously.
· Experience running mission critical systems at scale in the cloud.
· Unafraid to try new technologies, arbitrate complex technical discussions, and make challenging decisions.
· Great written and verbal communication skills, including ability to present ideas to both technical and business audiences.
· A self-starter who can thrive in ambiguity and builds trusted relationships. Has a “can do” attitude and entrepreneurial resourcefulness in getting things done in a fast-paced environment.
Required Technical Skills:
· Good understanding of logical and physical relational database design, including:
o Database normalization.
o Standard modeling patterns (Silverston, Hay, Blaha).
· Good understanding of NoSQL modeling patterns, including:
o Document database modeling.
o Key-Value database modeling.
o Event streaming database modeling.
o Graph database modeling.
o Column-family store modeling.
· Good understanding of stream processing architecture (Kafka experience preferred).
· Understanding of, and openness to, a polyglot persistence architecture.
· Familiarity with microservices.