Sr. Solutions Architect (Hadoop, Big Data, BI/DW Experience is a Must)

Location: Ann Arbor, MI 48104


Our History:
Since our start in 2009, Conexess has established itself in three markets, employing more than 150 individuals nationwide. Operating in over 15 states, we serve a client base ranging from Fortune 500/1000 companies to small and mid-sized businesses. Many of these smaller clients work with us exclusively because of our outstanding staffing track record.

Who We Are:
Conexess is a full-service staffing firm offering contract, contract-to-hire, and direct placements. We have a wide range of recruiting capabilities, extending from help desk technicians to CIOs, and we also offer project-based work.


Conexess is looking for a well-qualified Solutions Architect with a background in Big Data, BI, and Data Warehousing. This position will provide technical leadership around our client's Enterprise Information Management platform, focusing on large-volume information ingestion, storage, and analysis. The concentration will be on building and/or optimizing information models and physical data layouts; configuring, optimizing, and monitoring RDBMS and Hadoop environments; and improving overall processing efficiency to support the needs of the business.


Responsibilities:

  • Lead the design and development of highly scalable, optimized data models, using modeling software to document and maintain versions supporting Data Marts, Cubes, Data Warehouses, and Operational Data Stores (ODS).
  • Establish data standards for nomenclature, storage, design, and deployment.
  • Work in concert with a team of ETL developers to ensure efficient and accurate data transfer across the entire EDW ecosystem, including Big Data platforms.
  • Ensure optimized source-system replication models and operations.
  • Lead design and maintenance of the enterprise metadata solution used to communicate data definitions to the BI audience.
  • Act as the DW liaison to our Infrastructure Engineering teammates and coordinate initiatives with the other database administration groups on that sister team.
  • Ensure appropriate technical standards and procedures are defined; manage the development of centers of excellence around key storage sub-system technologies.
  • Collaborate on planning initiatives in application development, system architecture, future roadmaps, operations, and strategic planning.
  • Work with business teams and technical analysts to understand business requirements, and determine how to leverage technology to create solutions that satisfy them.
  • Present solutions to the business, project teams, and other stakeholders, speaking to both technical and non-technical audiences.
  • Create architecture and technical design documents to communicate solutions that will be implemented by the development team.
  • Work with development, infrastructure, test, and production support teams to ensure proper implementation of solutions.
  • Assess the impact of new requirements on an existing suite of complex applications.
  • Educate the organization on available and emerging toolsets.
  • Drive the evolution of infrastructure, processes, products, and services by persuading decision makers.
  • Develop proofs of concept and prototypes to illustrate approaches to technology and business problems; experience building Business Intelligence platforms in an enterprise environment is expected.
  • Integrate data (batch, micro-batch, and real-time streaming) across Hadoop, RDBMSs, and data warehouses (SQL Server 2016 preferred).
  • Build real-time data pipelines using technologies such as Apache Kafka, Spark, Storm, and Flume.
  • Analyze data using technologies such as Python, R, Scala, Pig, and Hive.
  • Build consumption frameworks on Hadoop (RESTful services, self-service BI and analytics).
  • Optimize the Hadoop environment (MapReduce, Spark, and HDFS footprint) and oversee Hadoop security, data management, and governance.


Qualifications:

  • Bachelor's degree in computer science, information technology, engineering, business administration, or a related field.
  • Knowledge of Hadoop required.
  • Understands the capabilities of key technologies (data modeling, data processing, BI analytics) and can quickly assess the applicability of commercial off-the-shelf technology.
  • Excellent grasp of integrating multiple data sources into an enterprise data management platform; able to lead data storage solution design.
  • Strong communication skills (oral and written).
  • Strong analytical and problem-solving skills.
  • Understanding of the software development lifecycle, including Agile methodology.
  • Ability to understand business requirements, collaborate with business users, and build pragmatic, cost-effective solutions using Agile project methodologies.
  • Minimum of 8-10 years of enterprise IT application experience, including at least 3 years architecting strategic, scalable BI and Big Data solutions.
  • 6 to 8 years' experience with relational DBMS technology, SQL Server focused.
  • 6 to 8 years' experience developing procedures, packages, and functions in a DW environment.
  • Deep experience with ANSI SQL and stored procedures.
  • 2+ years' experience with MapReduce, Pig, and HiveQL/Hadoop languages a plus.
