Lead Data Integration Specialist (Strong DW/ETL Experience and some real-time/Kafka/Spark/Python experience)
- Location: Ann Arbor, MI 48106
Who We Are:
Since our start in 2009, Conexess has established itself in three markets and employs more than 200 individuals nationwide. Operating in over 15 states, we serve clients ranging from Fortune 500/1000 companies to small and mid-size firms, many of which rely on us exclusively because of our outstanding staffing track record.
Conexess is a full-service staffing firm offering contract, contract-to-hire, and direct placements. Our recruiting capabilities extend from help desk technicians to CIOs, and we also offer project-based work.
Additional Job Details
This position provides technical leadership as a lead data integration and engineering developer focused on large-volume data ingestion and transformation. It is responsible for modeling data structures and orchestrating data interfaces into (and out of) our Enterprise Data Warehouse.
- Design and develop ETL (Talend), SQL, and Python-based processes to perform complex data transformations
- Design, code, and test major data processing features, and work jointly with other team members to deliver complex software enhancements for the enterprise data storage platforms (RDBMS and NoSQL)
- Build data integration solutions to handle batch, streaming, and IoT data on ETL and big data platforms
- Manage changes in the Enterprise Data Warehouse according to data warehousing best practices
- Provide thought leadership to deliver creative and efficient data-related technical solutions
- Gather requirements and construct documentation to aid in maintenance and code reuse in accordance with team processes and standards
- Monitor scheduled jobs and improve reliability of ongoing processing
- Monitor, measure, and enhance ways to improve system performance
- Must be able to remain calm in pressure situations and adapt quickly to change
- Must have the ability to work independently, with minimal supervision
- Perform other duties as assigned
- Strong understanding of and hands-on experience with key technologies (SQL, ETL, data modeling, data processing), with the ability to quickly assess the applicability of commercial off-the-shelf technology
- Ability to understand, participate in, and lead data integration solution design
- Strong understanding of and expertise in relational database and data processing concepts
- Hands-on experience designing and implementing data ingestion techniques for real-time processes (IoT, eCommerce)
- Development experience with big data technologies: Spark, Kafka, message queues
- Extensive hands-on experience with ETL tools (Talend a plus)
- Strong SQL skills with relational DBMS technology (SQL Server a plus)
- Experience with shell / Python scripting
- Strong communication skills (oral and written)
- Good analytical and problem-solving skills
- Knowledge of CRM, MDM, and Business Intelligence a plus
- Candidate must be thorough and detail-oriented
- Able to work on multiple priorities in a deadline-driven environment