Data Engineering Specialist (100% Remote) (ETL (Talend)/SQL/Python based processes, Big Data – Spark/Kafka) - Contract
- Job Title
- Data Engineering Specialist (100% Remote) (ETL (Talend)/SQL/Python based processes, Big Data – Spark/Kafka) - Contract
- Job ID
- 27625443
- Work From Home
- Yes
- Location
- Ann Arbor, MI 48106
- Other Location
- Description
Title: Data Engineering Specialist (100% Remote) (ETL (Talend)/SQL/Python based processes, Big Data – Spark/Kafka)
Our History:
Since our start in 2009, Conexess has established itself in three markets, employing more than 200 individuals nationwide. Operating in over 15 states, we serve a client base that ranges from Fortune 500/1000 companies to small and mid-sized companies. Many of these small and mid-sized companies work with us exclusively because of our outstanding staffing track record.
Who We Are:
Conexess is a full-service staffing firm offering contract, contract-to-hire, and direct placements. Our recruiting capabilities span a wide range of roles, from help desk technicians to CIOs, and we also offer project-based work.
General Responsibilities:
- Design and develop ETL (Talend)/SQL/Python processes to perform complex data transformations
- Design, code, and test major data processing features, and work jointly with other team members to provide complex software enhancements for the enterprise data storage platforms (RDBMS and NoSQL)
- Build data integration solutions to handle batch, streaming, and IoT data on ETL and Big Data platforms
- Develop and deliver changes in the Enterprise Data Warehouse according to data warehousing best practices
- Gather requirements and construct documentation to aid in maintenance and code reuse in accordance with team processes and standards
- Monitor scheduled jobs and improve reliability of ongoing processing
- Monitor, measure, and enhance ways to improve system performance
- Manage multiple deliverables efficiently
- Perform other duties as assigned
Requirements:
- Understanding of and hands-on experience with key technologies (SQL, ETL, data modeling, data processing)
- Development experience in a Big Data environment (Spark, Kafka, message queues) a plus
- 1–2 years of hands-on experience with ETL tools
- Strong SQL skills with relational DBMS technology; SQL Server experience a plus
- Experience with shell scripting / Python
- Good understanding of relational database and data processing concepts
- Experience handling multiple data formats (delimited files, JSON, XML, etc.)
- Hands-on experience designing and implementing data ingestion for real-time processes (IoT, eCommerce) a plus
- Strong communication skills (oral and written)
- Good analytical and problem-solving skills
- Knowledge of CRM, MDM, and Business Intelligence a plus
- Candidate must be thorough and detail-oriented
- Able to work on multiple priorities in a deadline-driven environment